Hacker News

This does seem to be the goal of the project. They mention that they want to build a centralized control panel and crawler, where you can control how and when your website is crawled.

But at the end of the day the resulting dataset would be sold to third parties so they can rank the results appropriately. Which to me seems to be the only sane way forward: only a government could run something at the scale of the Google crawler and succeed at doing so. And then everyone can build search engines on top of that.




> And then everyone can build search engines on top of that.

That bit troubles me. If the index is maintained by a government agency, and every search engine is using the same index, then that's a massive censorship avenue. I wonder how "open" Open Web Index is going to be.


If the index is censored, nothing stops you from adding to the index yourself.

The vast majority of the net is, after all, not indexed, so you can run your own indexer to cover whatever they didn't cover.
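A minimal sketch of what "adding to the index yourself" could look like: a toy inverted index where a locally crawled supplement is queried alongside a shared base index, and results are merged. All names (`InvertedIndex`, the example URLs) are illustrative, not part of the Open Web Index project.

```python
from collections import defaultdict

def tokenize(text):
    # Lowercase and split on non-alphanumeric characters.
    cleaned = ''.join(c if c.isalnum() else ' ' for c in text.lower())
    return cleaned.split()

class InvertedIndex:
    """Toy inverted index: term -> set of document URLs."""
    def __init__(self):
        self.postings = defaultdict(set)

    def add_document(self, url, text):
        for term in tokenize(text):
            self.postings[term].add(url)

    def search(self, query):
        # Return URLs containing every query term (AND semantics).
        terms = tokenize(query)
        if not terms:
            return set()
        results = set(self.postings[terms[0]])
        for term in terms[1:]:
            results &= self.postings[term]
        return results

# Hypothetical shared base index, as distributed by the project.
base = InvertedIndex()
base.add_document("https://example.com/a", "open web index crawling")

# Your own supplemental index, covering pages the base index missed
# (or censored entries you re-added yourself).
local = InvertedIndex()
local.add_document("https://example.com/b", "pages the shared index missed")

def search_both(query):
    # Union the base and local result sets.
    return base.search(query) | local.search(query)
```

Querying `search_both("index")` returns hits from both the shared and the local index, which is the gap-filling idea the comment describes.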




