
I get it, but in this case it is the DOCTOR basically thinking "stupid users". Nobody was endangered; the patient was just put in a heck of a lot of pain because the doctor assumed they were lying or wrong, and it wasn't even worth checking because their pain didn't matter that much to the doctor.

The hospital is a PRIME place for systems meant to avoid mistakes. (Although your example with checklists and pilots is odd; the checklist IS such a system, and a remarkably successful one. Over the last 10 years it has been used more often in hospitals too. I guess you could say that every checklist should have TWO people check it off or something, but it's such a successful technology at reducing mistakes that I'm not sure I've seen this suggested.)

But no system can make up for doctors who don't respect patients or care about their pain.




Sorry, I was unclear. The distinction between "blame the pilot" and "determine root causes so we can work to prevent recurrence" is relatively familiar in an aviation context. So, say, a recurrent checklist failure might prompt checklist improvement rather than merely blaming pilots: perhaps shortening a landing checklist by moving items to a less tense approach checklist.

So I attempted to illustrate by analogy that describing a common reason why something happens isn't being apologetic, or a distraction. Here, it seems possible that head-down engagement in physical manipulation might have distracted from patient communication and care management, as that flavor of failure is not uncommon.

Regarding tech to reduce mistakes, it will be interesting to see AR attempt to blend medical records with point of care. To see the patient is to see their status; to administer an injection is to have it recorded. But yeah, culture change is core, and hard, and tech doesn't yet provide great leverage on that. Or sometimes, as with VA OR teams composed by randomized assignment, something non-ideal has been knowingly chosen for its larger-scale properties, and the tech task is to mitigate the negative impact: for instance, detecting when meds were requested but, absent a team familiar with working together, never given. There was a VA trauma surgeon shopping around a request to implement that years back.
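A minimal sketch of what that detection might look like, assuming hypothetical order/administration records pulled from some EHR feed (the record shapes, names, and the 15-minute window are all made up for illustration, not any real system's API):

    from datetime import datetime, timedelta

    # Hypothetical record shape: (patient_id, med_name, timestamp)
    requested = [
        ("pt-1", "fentanyl", datetime(2019, 5, 1, 14, 0)),
        ("pt-2", "cefazolin", datetime(2019, 5, 1, 14, 5)),
    ]
    administered = [
        ("pt-1", "fentanyl", datetime(2019, 5, 1, 14, 3)),
    ]

    WINDOW = timedelta(minutes=15)  # assumed escalation threshold

    def unfulfilled_requests(requested, administered, window=WINDOW):
        """Return requests with no matching administration within `window`."""
        missed = []
        for pt, med, t_req in requested:
            fulfilled = any(
                pt == p and med == m and timedelta(0) <= t_adm - t_req <= window
                for p, m, t_adm in administered
            )
            if not fulfilled:
                missed.append((pt, med, t_req))
        return missed

    for pt, med, t_req in unfulfilled_requests(requested, administered):
        print(f"ALERT: {med} requested for {pt} at {t_req:%H:%M} not recorded as given")

The hard part, of course, isn't the matching logic; it's getting requests and administrations reliably recorded at the point of care in the first place, which is where the AR idea above comes in.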


> head-down engagement in physical manipulation might have distracted from patient communication and care management

No, a spade is a spade: the patient clearly said the doctor didn't _believe_ him.

This isn't an in-the-zone 'tunnel vision' moment from the doctor, high stress shutting off the outside world, or a CSR-type reaction. This is someone _assuming_ a patient was telling a lie, who then lacked the duty of care to confirm it before inflicting unnecessary pain.

I am all for checklists; they are a big part of my work processes and are desperately needed in many medical and nursing workflows. But this particular case is just a lack of common sense and duty of care.



