
Sounds good to me. I've had very little engagement (good or bad) on things I've submitted that I thought should interest at least a few people. A lot of content seems to fall through simply because it isn't timed right to catch somebody willing to push the upvote button. A link I submitted last weekend got a single upvote; I suspect nobody actually saw or clicked it, so the outcome had nothing to do with merit. I've also had links go nowhere, only to see the same link submitted by somebody else reach the front page much later.

Duplicate links are actually interesting, since they are so easy to detect (identical URL). Why not aggregate metrics across them? What matters is the link making it to the front page instead of accumulating duplicates. Somebody submitting a duplicate would effectively count as an upvote for the already submitted link. Views and duplicate submissions are possibly more significant signals than upvotes.
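
To make that concrete, here's a rough sketch of what the aggregation could look like. The URL normalization rules and the in-memory store are my own assumptions for illustration, not anything HN actually does:

  from urllib.parse import urlparse, urlunparse

  def normalize(url: str) -> str:
      # Canonicalize a URL so duplicates compare equal (assumed rules:
      # lowercase the host, drop fragment and trailing slash, keep query).
      p = urlparse(url.strip())
      path = p.path.rstrip("/")
      return urlunparse((p.scheme, p.netloc.lower(), path, "", p.query, ""))

  scores: dict[str, int] = {}  # canonical URL -> aggregated score

  def submit(url: str) -> None:
      # A duplicate submission counts as an upvote on the original.
      key = normalize(url)
      scores[key] = scores.get(key, 0) + 1

  submit("https://example.com/post/")
  submit("https://example.com/post")  # duplicate, counted as an upvote
  print(scores)  # {'https://example.com/post': 2}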

In search, precision and recall are the two standard metrics for judging result quality. It's important to realize that Hacker News is effectively a ranking algorithm, and therefore a kind of search engine, even though it delegates keyword search to Algolia. It sacrifices recall for precision: everything on the front page should be relatively high quality, but at the price of potentially high-quality things never reaching it (recall).
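
For reference, the standard definitions, with front-page placement playing the role of retrieval (the framing of "relevant" as "high quality" is mine):

  def precision(retrieved: set, relevant: set) -> float:
      # Fraction of front-page items that are actually high quality.
      return len(retrieved & relevant) / len(retrieved) if retrieved else 0.0

  def recall(retrieved: set, relevant: set) -> float:
      # Fraction of all high-quality submissions that made the front page.
      return len(retrieved & relevant) / len(relevant) if relevant else 0.0

  front_page = {"a", "b", "c"}
  high_quality = {"a", "b", "d", "e"}
  print(precision(front_page, high_quality))  # ~0.67: most of the page is good
  print(recall(front_page, high_quality))     # 0.5: half the good stuff never surfaced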

Of course, with only 30 slots on the front page, there's only so much that can be shown, especially when you consider that many users only drop by once or twice a day. So those slots stay occupied for quite a while as well, days in some cases. The choice of what belongs there is highly subjective (i.e. the moderators decide) and biased towards the intentions of the site, and that's intentional. But that doesn't mean it can't be improved.
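
The commonly cited approximation of HN's ranking formula shows why a slot, once won, takes so long to free up; the gravity exponent of 1.8 comes from public write-ups of the algorithm, not from anything official:

  def rank_score(points: int, age_hours: float, gravity: float = 1.8) -> float:
      # HN-style decay: votes push a story up, age pulls it down polynomially.
      return (points - 1) / (age_hours + 2) ** gravity

  print(rank_score(300, 24))  # ~0.85: a day old and still holding a slot
  print(rank_score(5, 1))     # ~0.55: fresh, but too few votes to displace it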



