> paywalls that leave ways for readers to work around them
I would call pasting-the-URL-into-Google-Search less of an intentional workaround and more of a trick that takes advantage of websites' compliance with Google's rules.
Not every HN reader would know to do that, or look in the comments for that "workaround."
That's right, so it's ok for people to ask and share how to read an article in the comments. There shouldn't need to be more than one or two comments about this, and it helps everyone focus on the content.
What's off-topic is the generic tangent of paywall complaining.
If an article is from a brand-new source that uses a kind of paywall you've never encountered before, then hopefully someone will comment with a workaround. If not, you can ask. Maybe you could even subscribe to publications you like, read regularly, and want to support.
Since almost all paywalled articles are from the WSJ, the Economist, or the NYT, this shouldn't happen to you very often.
Eventually you learn these things, from living life (including reading HN for a while). And if you can't read an article because there's no workaround, or because you don't know of a workaround:
- Just don't read the article.
- Subscribe. If you can't or won't pay for it, then see above, or see below.
- Search for other sources of the information, and post them; that adds to the discussion. Most articles worth taking up space here, particularly ones on paywalled sites, get covered in other venues too. Almost nothing is exclusive, not after a day anyway.
In the WSJ case, I've noticed that Yahoo often reprints the article verbatim.
That's one option. You can also google the title or URL of the article (this is the most common workaround); or you can search the comments for the word 'paywall'; or you can purchase a membership or subscription for the paywalled site; or you can skip reading the article.
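For the search-the-title workaround, the mechanical part is just building a search URL from the headline; here's a minimal sketch (the helper name and the use of Google's /search endpoint are purely illustrative):

    # Minimal sketch: turn an article title into a Google search URL, i.e. the
    # "search the headline" workaround done programmatically instead of by hand.
    from urllib.parse import quote_plus

    def headline_search_url(title: str) -> str:
        return "https://www.google.com/search?q=" + quote_plus(title)

    # headline_search_url("Some Paywalled Headline")
    # -> "https://www.google.com/search?q=Some+Paywalled+Headline"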
So all I need to do is try every possible option? And even then it may fail (scientific journals, newspaper archives, etc.)?
The links are just huge wastes of time. A prominent tag attached to the article would be ok, but in the absence of any other feature to avoid these time sinks, it makes sense to flag the articles to save others from wasting their time too.
This site has a graph that says 55% of visitors to gaming sites use adblock: http://contently.com/strategist/2015/07/10/why-adblockers-sh... . I would assume Hacker News visitors would be at a similar number. That would be an interesting thing to measure. Someone who gets to the front page of HN should measure what percent of people with an HN referer block ads.
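If someone wanted to run that measurement, one rough approach is to compare HN-referred pageviews against fetches of an ad-shaped "bait" resource, since most adblockers refuse to request URLs that match common filter lists. A minimal sketch, assuming a Flask app; all names and paths here are made up:

    # Rough sketch: estimate how many HN-referred visitors block ad-like
    # resources. The page embeds a "bait" script at an ad-shaped URL;
    # adblockers typically refuse to fetch it, so the gap between HN pageviews
    # and HN-tagged bait fetches approximates the adblock rate.
    from flask import Flask, Response, jsonify, request

    app = Flask(__name__)
    counts = {"hn_pageviews": 0, "hn_bait_loads": 0}  # in-memory, illustrative only

    def from_hn(req) -> bool:
        return "news.ycombinator.com" in (req.referrer or "")

    @app.route("/")
    def page():
        hn = from_hn(request)
        if hn:
            counts["hn_pageviews"] += 1
        # The path matters: common filter lists block URLs that look like "ads.js".
        src = "/static-ads/ads.js" + ("?hn=1" if hn else "")
        return Response(f"<p>article body</p><script src='{src}'></script>",
                        mimetype="text/html")

    @app.route("/static-ads/ads.js")
    def bait():
        if request.args.get("hn") == "1":
            counts["hn_bait_loads"] += 1
        return Response("/* bait */", mimetype="application/javascript")

    @app.route("/stats")
    def stats():
        pv = counts["hn_pageviews"]
        rate = 1 - counts["hn_bait_loads"] / pv if pv else None
        return jsonify(hn_pageviews=pv, estimated_adblock_rate=rate)

    if __name__ == "__main__":
        app.run()

The number would only be an estimate: it ignores bots, caching, and people who block scripts entirely.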
I've thought about trying to solve this problem with software, but it feels like a line we probably shouldn't cross. Hence the current answer: it's fine for users to help each other read articles. That seems unimpeachable, whereas having HN officially undermine paywalls seems like a Schrödinger's can of worms, if not a classical one.
The news sites have made the economic calculation that allowing access to traffic from content aggregators like Google (which is the price of being discoverable by Google) is worthwhile.
The idea that only sufficiently large aggregators/traffic sources should get a special pass seems preposterous; anyone trying to enforce it would be engaging in downright anticompetitive behavior.
The cat is already dead, can we please open the box & acknowledge the source of the foul smell?
Maybe you could automatically put paywall bypass instructions in the "text" portion of URL submissions?
edit: Auto-generated bypass instructions would get the pinned, top-of-thread placement that we normally try to avoid giving to individual users' comments.
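If HN did that, the generation itself would be trivial; a minimal sketch, assuming a moderator-maintained table of known-paywalled domains (nothing here reflects HN's actual submission pipeline, and the note text is illustrative):

    # Rough sketch: look up a canned note for known-paywalled domains so it
    # could be appended to a submission's text field automatically.
    from urllib.parse import urlparse

    PAYWALL_NOTES = {
        "wsj.com": "Paywalled; searching the headline often turns up a readable copy.",
        "nytimes.com": "Metered paywall; searching the headline usually works.",
        "economist.com": "Metered paywall; a limited number of articles are free.",
    }

    def bypass_note(submitted_url: str) -> str | None:
        host = (urlparse(submitted_url).hostname or "").lower()
        for domain, note in PAYWALL_NOTES.items():
            if host == domain or host.endswith("." + domain):
                return note
        return None

    # e.g. bypass_note("https://www.wsj.com/articles/example") returns the WSJ note.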
If sites didn't want people to bypass paywalls, they wouldn't allow "special" ways to bypass them. The fact that some paywalls have special referrer bypass rules reeks of financially motivated favoritism and entrenched interests preventing competition; the next search-engine startup to be created is going to have a rough time of it.
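For what it's worth, the referrer bypass being criticized is usually just a server-side allowlist; a purely illustrative sketch, with no particular publisher's implementation implied:

    # Purely illustrative: a referrer allowlist of the kind being criticized here.
    from urllib.parse import urlparse

    WAIVED_REFERRERS = {"www.google.com", "news.google.com"}  # hypothetical allowlist

    def serve_full_article(referer_header: str | None) -> bool:
        host = (urlparse(referer_header or "").hostname or "").lower()
        return host in WAIVED_REFERRERS  # everyone else gets the paywall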