
I get that the scale of this problem is daunting, but I think that fundamentally you do not have a solution if you are copying PageRank (even with upvotes etc. added) for queries that haven't been screened by your staff.

I think you need to take a good hard look at what makes for shitty content and build some parameters off that. And if you had to go off PageRank (to begin with), I would be trying things like adding hidden penalties to popular CMSes, boosting reddit/HN content, and following SEO trends just to thwart them. I would categorize websites by expertise so queries related to Korea do not rank pages from the Hindustan Times. When you search for air filters, you should probably get the HouseFresh team and not Forbes etc. The upvote system you have is moving in the right direction, but even that will need to be fortified with anti-SEO measures.
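A rough sketch of what that kind of re-ranking could look like, just to make the idea concrete. All domain names, weights, and field names here are made up for illustration; this is not Kagi's actual ranking logic:

    # Hypothetical re-ranking pass: source boosts, CMS penalties,
    # and a topical-expertise check against the query topic.

    SOURCE_BOOSTS = {
        "news.ycombinator.com": 1.3,   # boost discussion sites
        "reddit.com": 1.2,
    }
    CMS_PENALTIES = {
        "generic-affiliate-cms": 0.6,  # hidden penalty for templated SEO farms
    }

    def rerank(results, query_topic):
        """Adjust each result's base relevance score, then sort by the new score."""
        reranked = []
        for r in results:
            score = r["base_score"]
            score *= SOURCE_BOOSTS.get(r["domain"], 1.0)
            score *= CMS_PENALTIES.get(r.get("cms"), 1.0)
            # Downrank sites whose declared expertise does not cover the query
            # topic, e.g. a general news outlet ranking for a niche product review.
            if query_topic not in r.get("expertise_topics", ()):
                score *= 0.7
            reranked.append({**r, "score": score})
        return sorted(reranked, key=lambda r: r["score"], reverse=True)

The point is not these particular numbers but that the penalties and boosts are hidden, hand-tuned inputs that SEO operators cannot cheaply reverse-engineer.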

I would try to create real EEAT standards that cannot be gamed without massive investments.

It's too late to undo the damage that the Danny Sullivans of the world have done, but maybe you can save something here.

All we need is a search quality report, as I indicated before, and our team will look into it. The fact that there is little or no spam for the other things Kagi users care about is a testament to our determination to deal with it.
