For the purposes of their research, web scraping is stable enough, given that many websites (especially government sites) aren't overhauled frequently. And many government sites, like those run by coroners, aren't likely to have APIs.
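For a site like that, the whole scrape can be a handful of lines. Here's a minimal sketch with requests and BeautifulSoup; the URL and the table selector are made up, a real site would need its own:

```python
# Minimal sketch: pull a table of records from a hypothetical
# government page into a CSV. URL and selectors are placeholders.
import csv

import requests
from bs4 import BeautifulSoup

URL = "https://example.gov/coroner/case-reports"  # hypothetical

resp = requests.get(URL, timeout=30)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = []
for tr in soup.select("table#cases tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.find_all("td")]
    if cells:
        rows.append(cells)

with open("cases.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)
```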
Contacting the agency for the data is always a good step, but even if they are responsive, they may not be willing to email you on a daily basis with data updates.
That is a good argument, and I should have mentioned it, yes. For a one-off job, web scraping will probably be the best choice, and maybe even the fastest to implement. I have done my own share of web scraping for personal projects (and thus know about the fragility), but for those, broken results down the road didn't matter much.
But in the article they mentioned re-running their program to update their data, so it could be a long-term effort. And anyone reading that article, taking their advice, and planning a long-term project should at least be warned about this possible problem, and about the cheap mitigation sketched below.
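The cheapest insurance is to make a recurring scrape fail loudly when the page no longer looks like it used to, instead of silently writing garbage. Something like this sketch; the selector and expected headers are illustrative, not from the article:

```python
# Fail loudly after a site redesign instead of producing garbage.
# Selector and headers are hypothetical examples.
import sys

import requests
from bs4 import BeautifulSoup

EXPECTED_HEADERS = ["Case #", "Date", "Cause of Death"]  # hypothetical

def fetch_rows(url: str) -> list[list[str]]:
    resp = requests.get(url, timeout=30)
    resp.raise_for_status()
    table = BeautifulSoup(resp.text, "html.parser").select_one("table#cases")
    if table is None:
        sys.exit("page structure changed: results table not found")
    headers = [th.get_text(strip=True) for th in table.select("th")]
    if headers != EXPECTED_HEADERS:
        sys.exit(f"page structure changed: headers now {headers!r}")
    return [[td.get_text(strip=True) for td in tr.find_all("td")]
            for tr in table.select("tr")[1:]]
```

Run that from cron and a redesign shows up as a failed job instead of months of quietly corrupted data.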
We used to despise web scraping, so we tried to dump data from an industrial system by listening to raw Modbus RS-485 traffic instead. Turned out that there was no way to get the register maps from the vendor, nor from the firmware.
We ended up writing a scraper for the main unit. Once these units have been installed and configured, they're likely not going to be updated unless absolutely necessary (such as when adding new, unsupported controllers to the bus).
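For the curious, the "listening" part boils down to tapping the bus with a serial adapter and keeping the CRC-valid frames. A rough sketch of that passive decode, using pyserial; the device path and line settings are placeholders, and the inter-byte timeout is a crude stand-in for Modbus RTU's 3.5-character silence rule:

```python
# Passively decode Modbus RTU frames from an RS-485 tap.
# Device path and baud rate are assumptions (9600 8N1).
import serial  # pyserial

def crc16(data: bytes) -> int:
    """Standard Modbus RTU CRC-16 (polynomial 0xA001)."""
    crc = 0xFFFF
    for byte in data:
        crc ^= byte
        for _ in range(8):
            crc = (crc >> 1) ^ 0xA001 if crc & 1 else crc >> 1
    return crc

with serial.Serial("/dev/ttyUSB0", 9600, timeout=0.05) as port:
    while True:
        frame = port.read(256)  # read timeout splits bursts into frames
        if len(frame) < 4:
            continue
        body, crc = frame[:-2], int.from_bytes(frame[-2:], "little")
        if crc16(body) == crc:  # CRC is appended low byte first
            addr, func = body[0], body[1]
            print(f"slave {addr}, function 0x{func:02x}, data {body[2:].hex()}")
```

Of course, without the register maps, all this gives you is well-formed frames full of numbers you can't interpret, which is why we fell back to scraping.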