Hacker News new | past | comments | ask | show | jobs | submit login

I think this is a feature, and not a bug.



Agreed. Every time I get the "unknown or expired link" error, I'm grateful that it didn't show me a slightly inaccurate (according to some metric/specification) list.

/sarcasm


It could be intentional, but every time I click "More" I expect to see more news items; the exact rank of those items at that particular moment doesn't really matter to me. It would feel more natural to show items ranked 31 to 60, or something along those lines...


You are probably right. At first I was frustrated at having to rescan articles starting from the top, but then I decided it reminds me I'm spending too much time reading HN and not enough time doing...

-Edited to be more accurate of the inside of my head/thought process


This is the old "Twitter downtime is actually a feature" BS from 2010, but for HN.


How so?


As far as I understand, the generated "more" link takes you to a second page that never repeats articles from the first page, even when some of those articles should have been pushed down to the second page because new articles were posted in the meantime.

Think of an always up-to-date list divided into two pages: clicking through to the second page could show you links you already saw on the first. Here, instead, the second page is "frozen" in time (the session) and only displays the rest of the links.

I might be wrong though.

What I don't know is why a "news.ycombinator.com/page/2" URL design couldn't be handled with a client-side session and cookie while still offering the same feature.
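For illustration, here is a minimal sketch of the two approaches being contrasted. All names (`first_page`, `more`, `page_of`, etc.) are hypothetical and not HN's actual code: the first style freezes the ranked list in a server-side snapshot that can be evicted (hence "unknown or expired link"), while the second re-slices the live list on every request, so links never break but items can repeat across pages.

```python
# Hypothetical sketch contrasting snapshot-based and stateless pagination.
# Names and structure are illustrative, not taken from HN's implementation.

import itertools

PAGE_SIZE = 30

# --- Style 1: snapshot/session pagination (what HN's "More" link resembles) ---
# The first request freezes the ranked list; later pages slice the frozen copy.
# If the stored snapshot has been evicted, the link yields "expired".
_sessions = {}
_next_id = itertools.count(1)

def first_page(live_items):
    sid = next(_next_id)
    _sessions[sid] = list(live_items)          # freeze the ranking
    return sid, _sessions[sid][:PAGE_SIZE]

def more(sid, page):
    snapshot = _sessions.get(sid)
    if snapshot is None:
        return None                            # "unknown or expired link"
    start = (page - 1) * PAGE_SIZE
    return snapshot[start:start + PAGE_SIZE]

# --- Style 2: stateless offset pagination (the /page/2 idea) ---
# Every request re-slices the current list; items can repeat across pages
# when new submissions push old ones down, but links never expire.
def page_of(live_items, page):
    start = (page - 1) * PAGE_SIZE
    return list(live_items)[start:start + PAGE_SIZE]
```

With the snapshot style, a new submission arriving between page loads doesn't change what "page 2" shows; with the stateless style, it shifts everything down by one, which is exactly the repeated-item behavior discussed above.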


I'd much rather see some repeated submissions than an empty, broken page. But that's just me.


I always thought it was designed like that to slow down scrapers.





