> that they're underpaid or overworked or whatever
Having family in healthcare and being a lifelong patient due to a congenital defect, a job in software (including management obv) is a breeze compared to health care worker jobs across the board. They require more "schooling" (via mind-numbing training), are underpaid, have much more serious consequences from mistakes, have less oversight, work longer hours and under worse conditions. It's disheartening.
People's work shouldn't be reduced, but drugs and devices should be to an extent. Firefighters told me the portable EKG cost thousands of dollars (5 or 10, I don't recall). That thing had such bad 2G connectivity that the graph couldn't be sent to a cardiologist. The sensor/DSP part is worth money; the rest would cost $100 from any smartphone maker today, with better performance and portability.
I'm sure the price is rigged for margins. This shouldn't be a market; it should be an open duty to make these cost-efficient.
The real cost is not so much in the hardware but in the certification requirements. Medical devices need to be built to and work at a higher standard than consumer electronics, for obvious reasons.
But if the cost becomes so high that people simply go without the devices, is that really a good tradeoff? Maybe a device with 1 in 10,000 failures is better than no device at all, for some subset of illnesses?
Also, I suspect this is FUD by some entrenched medical device companies who know how to play the game of complying with standards. Anyone here ever gone through compliance testing with software? I've seen systems pass that are worse than what I would have just thought up in 30 minutes and coded up for fun. It's just a warm fuzzy feeling.
1) Certification that takes 10 years (more like 3, but still) means you have, at the very best, 10-year-old technology.
2) Certified to a higher standard. For various reasons that can be very different from actually being more reliable; for instance, circumstances change or knowledge advances.
For instance, certification tends to demand "proof" that something works. Yet the most reliable robots are built from pretty bad hardware, with multiple components able to do the same job "most" of the time. Such a device, despite being much more reliable, is disqualified a priori in nearly all certification processes I've seen.
(needless to say, the things they take as proof tend to be ... less than proof)
Plus we've all been in a company having this discussion: "For the price of this one 'reliable' server we could have 20 normal ones, and together they would be a hell of a lot more reliable. Hell, just give me 3 of the cheap ones and I'll make it more reliable." And we all know what the boss's answer and the resulting reliability were (the arithmetic is sketched after this list).
3) You assume no regulatory capture (or outright dishonesty on the part of government employees and/or lawmakers)
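To make the redundancy argument concrete, here is a minimal back-of-the-envelope sketch. The uptime figures are purely illustrative, and it assumes independent failures, which real deployments rarely have:

```python
# Back-of-the-envelope redundancy math (illustrative numbers; assumes
# independent failures, which real deployments rarely have).

def availability(per_node_uptime: float, nodes: int) -> float:
    """Probability that at least one of `nodes` identical servers is up."""
    return 1 - (1 - per_node_uptime) ** nodes

cheap = 0.99        # a "normal" server: up 99% of the time (assumed)
expensive = 0.9999  # the one "reliable" server: up 99.99% (assumed)

print(f"1 expensive server: {expensive:.6f}")
print(f"3 cheap servers:    {availability(cheap, 3):.6f}")   # ~0.999999
print(f"20 cheap servers:   {availability(cheap, 20):.6f}")  # ~1.0 to 6 decimals
```

Under those (admittedly generous) assumptions, three cheap boxes already beat the single expensive one, which is exactly the point the engineer in that meeting is making.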
So you would trust a random baseband chip to never ever in a thousand years lie to you about the success or failure of its operation or its internal status? Or to behave in an unexpected way that interferes with the core function of the device? E.g. by suddenly sending random junk to the main CPU?
How do you verify that the display controller is not acting up and not blanking out a region of the display that contains essential data?
If you cannot prove things like that for your medical device, you won't be allowed to sell it.
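One common mitigation pattern for the display question is readback-and-compare: render the safety-critical region, then read it back and check it against what you intended to draw. This is only a hedged sketch of the idea; the region, CRC approach, and simulated framebuffer are illustrative, not any particular standard's requirement:

```python
# Minimal sketch of a readback check for a safety-critical display region.
# Assumes the display controller lets you read the framebuffer back; the
# region coordinates and CRC choice are illustrative only.
import zlib

def render_vitals(buf: bytearray, region: slice, pixels: bytes) -> int:
    """Write the critical pixels and return the CRC we expect to read back."""
    buf[region] = pixels
    return zlib.crc32(pixels)

def verify_vitals(buf: bytearray, region: slice, expected_crc: int) -> bool:
    """Read the region back from the (simulated) framebuffer and compare."""
    return zlib.crc32(bytes(buf[region])) == expected_crc

framebuffer = bytearray(320 * 240)   # stand-in for the real framebuffer
vitals_region = slice(0, 320 * 40)   # top strip holding the critical numbers
expected = render_vitals(framebuffer, vitals_region, b"\xff" * (320 * 40))

if not verify_vitals(framebuffer, vitals_region, expected):
    raise RuntimeError("display mismatch: fall back to alarm / safe state")
```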
If only. I work in the same town as a big medical device manufacturer, and several co-workers over the years had worked there and immediately noped out on moral grounds. Stuff like panics on anything out of the ordinary in a morphine pump that defaults to full on while resetting. Apparently those killed a few people.
Meeting the FDA guidelines is more about finding the cheapest way to technically meet the spec rather than trying to build something safe.
I do get that such a device panics on the smallest error. But then it is supposed to go into a safe mode. "Full on" does not seem safe to me. Full off, emitting an acoustic alarm until it is actively acknowledged, would be the right thing to do. Whoever designed this thing to do what it did was frankly a moron.
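For what it's worth, the "full off plus latched alarm" behavior described above is easy to express as a fault handler. This is a toy sketch, not real pump firmware, and the state model is assumed for illustration:

```python
# Sketch of a fail-safe fault handler for an infusion pump: on any
# unexpected fault, stop delivery and latch an alarm until a human
# acknowledges it. Purely illustrative; not real pump firmware.
from dataclasses import dataclass

@dataclass
class PumpState:
    delivering: bool = False
    alarm_latched: bool = False

def on_fault(state: PumpState, reason: str) -> None:
    # Fail-safe default: full OFF, never full ON.
    state.delivering = False
    state.alarm_latched = True
    print(f"ALARM: {reason} - infusion stopped, acknowledge to continue")

def acknowledge(state: PumpState) -> None:
    # Only an explicit human action clears the alarm; delivery stays off
    # until it is deliberately restarted.
    state.alarm_latched = False

pump = PumpState(delivering=True)
on_fault(pump, "sensor value out of range")
assert not pump.delivering and pump.alarm_latched
```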
But gaming the certification process is unfortunately also a thing. In the EU, certification is performed by private companies that are themselves certified by the government for the job. The kicker is that they compete against each other on a free market, so shopping around for the most lenient certification process could be a thing. I haven't witnessed it yet, but it's certainly possible in that system. The thought alone scares me.
Oh yeah, EU certifications are sadly super weak. Now to your original point, I think it's quite possible to have an open review system. Linux and the like have shown great capability in finding and fixing issues fast. With a national effort to ensure paid engineers, it's not science fiction.
I think primary care workers are generally underpaid and specialists are generally overpaid. That said, there's a lot of variation in the data. If you look at some studies of why the US spends more than other countries on healthcare without better outcomes, the difference in prices between the US and other systems is a significant driver. Labor (particularly physicians), drugs and administration cost more here than elsewhere [1]. It's true doctors go through a lot of training, but I'm not terribly sympathetic to stories of a surgeon not being paid enough. Family practice and primary care I'd be more inclined to sympathize with, but even there we spend more than most countries, it seems. [EDIT] I'm less sure what wages look like for the "blue collar" healthcare workforce (health aides, etc.); I would totally buy that many are underpaid.
Administrative costs are a terrible drag, and I say that knowing that I am currently paid out of administrative costs. We could bring them more in line with the international norm, but we'd still be substantially more expensive, because administrative costs are still not the bulk of total expense.
Drugs are an interesting story because the US effectively subsidizes international drug costs [2]. Also, drugs are a way to stop way more expensive interventions (better to take a $100K drug that cures Hepatitis C than get a liver transplant that will cost far more than that and won't give as good quality of life), so maybe in some ways we should spend more there if the treatments are worthwhile. That said there are disturbing pricing trends in the industry that are clearly exploitative.
In summary, what nationalized systems buy you is lower administrative costs (good), price controls on medical services (maybe good, but it necessarily means docs get paid less down the line), and price controls on drugs/devices (maybe good, but with possible trade-offs in developing treatments that are less expensive than other interventions). All of the parties involved will be fighting this "efficiency", some more justifiably than others. Who knew healthcare could be so hard?
I'm pretty sure most people, even administrators in hospitals and insurance, are overworked. Health care in the US is the textbook definition of big business. The only people making easy, stupid money are shareholders.