Hacker News

Too bad there isn't something like DOI or ARK [0] that anyone can use to give documents a searchable, permanent ID, so that their location can be maintained over time. IME, the half-life of many URLs (5-10 years?) makes them unreliable. I was recently unable to find (by URL) an entire historical collection at a major southern US university, until I discovered it had been moved to a new server.

[0] https://arks.org/about/




> the half-life of many URLs (5-10 years?) makes them unreliable.

"Simple" enough experiment, on this very site: use the "past" feature in the top menu to step back a day at a time, and tally how many of the external submission links are broken the further back you go.-

The number of dead projects, expired domains, broken links, 404s, etc. is sad.-
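A minimal sketch of that experiment, using the official HN Firebase API (the `/v0/topstories.json` and `/v0/item/{id}.json` endpoints are real; the sampling and "dead" criteria are just my assumptions, and a HEAD request will misclassify some sites that block bots):

```python
import json
import urllib.error
import urllib.request

API = "https://hacker-news.firebaseio.com/v0"

def fetch_json(url):
    with urllib.request.urlopen(url, timeout=10) as r:
        return json.load(r)

def link_status(url):
    """Return the HTTP status code, or None if the host is unreachable."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "linkrot-tally"})
    try:
        with urllib.request.urlopen(req, timeout=10) as r:
            return r.status
    except urllib.error.HTTPError as e:
        return e.code
    except (urllib.error.URLError, OSError):
        return None

def is_dead(status):
    """Count unreachable hosts and 4xx/5xx responses as dead links."""
    return status is None or status >= 400

if __name__ == "__main__":
    dead = total = 0
    # Sample the current front-page stories; a real tally would walk
    # older items by descending ID instead.
    for item_id in fetch_json(f"{API}/topstories.json")[:50]:
        item = fetch_json(f"{API}/item/{item_id}.json")
        url = item.get("url")
        if not url:
            continue  # Ask/Show HN text posts have no external link
        total += 1
        dead += is_dead(link_status(url))
    print(f"{dead}/{total} sampled external links dead")
```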


Someone did that and found that only 5% of submitted URLs (200k/4M) were dead: https://blog.wilsonl.in/hackerverse/

Submitted 3 months ago: https://news.ycombinator.com/item?id=40307519


Thanks for the data. That's actually not bad. Then again, 5% of a "webscale" number of sites is still a lot.-

Thanks for pointing to that study ...



