Hacker News

It feels like yet another way for Google to offload the hard work of indexing the web onto other people.

See also: microdata, and the monthly alerts I get from Google claiming the content of my sites is malformed, even though it validates just fine in a dozen other tools.
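For context, "microdata" here means inline schema.org annotations along these lines (a minimal, hypothetical sketch; part of the frustration above is that generic validators and Google's stricter Rich Results requirements can disagree about markup like this):

```html
<!-- Minimal schema.org Article markup using microdata attributes.
     Property names come from schema.org/Article; Google may demand
     additional fields beyond what generic validators check, which is
     one common source of "malformed content" alerts. -->
<article itemscope itemtype="https://schema.org/Article">
  <h1 itemprop="headline">Example headline</h1>
  <span itemprop="author" itemscope itemtype="https://schema.org/Person">
    <span itemprop="name">Jane Doe</span>
  </span>
  <time itemprop="datePublished" datetime="2024-01-15">January 15, 2024</time>
</article>
```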

If this keeps up, eventually we'll have to log in to Google to provide it with the URLs of new content, and fill in all of the metadata about each page. All in the name of whatever buzzword Big G comes up with that month.




> If this keeps up, eventually we'll have to log in to Google to provide it with the URLs of new content, and fill in all of the metadata about each page

FWIW, most people who care about their search ranking would absolutely love a tool like this. The point of the automated crawl isn't just to find the information; it's that if you let people submit their own descriptions to the index, they'll lie about what the content is.


> hard work of indexing the web

IMO, if you mean that Google should check linked-to content rather than relying on link qualifiers: Google has been doing a bad job of that in recent years, since most of the time I can't find useful material amid an ocean of low-effort clickbait. OTOH, relying on publisher-supplied metadata won't solve this problem either.


Dare I say that a better-indexed web is a public good that we all benefit from immensely.

Logging into Google and providing it with the URLs isn't analogous, because that only benefits Google: it creates a barrier to entry for competitors, which isn't good for us users.


> Dare I say that a better-indexed web is a public good that we all benefit from immensely.

Sure, if it were used for the benefit of the public, rather than the benefit of Google.




