
I think this article actually agrees with you. The very first NOTE is this:

""" Note: Be cautious about using data that was primarily created for insurance purposes. Often, it's not truly reflective of patient's condition but rather encompassing for billing / profit. Luckily, there are clinical reports, like radiology, diagnostic imaging, pathology reports, etc., that are intended for physician use and are more reflective of true patient conditions. Unfortunately, most of this data is not readily available in APIs because it's largely unstructured. This is a ripe space for ML to take raw, unstructured data and produce structured, computable data. """




Insurance is just one part of the problem.

Large practices have treatment standards on which physicians are evaluated. Reporting side-effects might be politically inconvenient in some cases. Medicine is also, like other human endeavors, subject to fashions and fads.

At best, applying machine learning to health records will generate a hypothesis that must be checked in a properly controlled trial.


So it’s not a “terrible idea”?


Many organizations have been attempting to use AI to extract coded concepts for years. Even the best systems have a relatively high error rate, so if you want to use the output for anything important you still need a trained human medical coder to fix the errors. But that AI rough draft does have value since correcting its errors is still faster than entering all the codes manually.
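
A minimal sketch of that human-in-the-loop workflow, where the model only produces a rough draft and a trained coder signs off. The propose_codes() call, the example codes, and the confidence threshold are hypothetical placeholders, not any particular vendor's API:

  from dataclasses import dataclass

  @dataclass
  class ProposedCode:
      code: str          # e.g. an ICD-10-CM code such as "E11.9"
      evidence: str      # text span the model based the code on
      confidence: float  # model's own estimate, 0..1

  def propose_codes(note_text: str) -> list[ProposedCode]:
      # Placeholder for the real extraction model; returns fixed examples here.
      return [ProposedCode("E11.9", "type 2 diabetes without complications", 0.93),
              ProposedCode("I10", "essential hypertension", 0.61)]

  def triage(proposals: list[ProposedCode], review_below: float = 0.9):
      # Split the rough draft so the coder sees the shakiest suggestions first;
      # nothing is submitted without human confirmation.
      prefilled = [p for p in proposals if p.confidence >= review_below]
      needs_review = [p for p in proposals if p.confidence < review_below]
      return prefilled, needs_review

  prefilled, needs_review = triage(propose_codes("...clinical note text..."))
  # A trained medical coder confirms 'prefilled' and resolves 'needs_review'
  # before anything is used for billing or analytics.

The point of the split is only to order the coder's work; even high-confidence suggestions still get reviewed, which is why the draft saves time without removing the human from the loop.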



