Just a quick question: do you write software? Do you have a legal or economic background? It seems pretty clear to me that anyone suggesting that software bugs in applications with no risk of causing physical harm should carry criminal liability has no idea what they are talking about, or what damage such a law would cause.
Case in point: look at the quality of medical software today. Hospitals still use Windows XP and other completely insecure, outdated software, because absolutely nobody wants to deal with the nightmare that is HIPAA.
Hi. I've worked in medical software repeatedly. I totally want to deal with HIPAA. It's a good idea for clients (the people who actually matter) and it's not nearly as difficult a prospect to work with as people say. The set of demands it makes upon you is small and reasonably constrained, and nearly all of them are process-based rather than technical. Where it is technical, plenty of folks will sign a BAA and take big chunks of the technical stack off your hands, too (there's a sketch of what that technical side looks like at the end of this comment).
"But HIPAA" has never, in my experience, been employed except by people who find the idea of doing the right thing inconvenient or inconveniently expensive. (It is virtually never that hard and its benefits are clear.)
There are reasons for not modernizing tech stacks in the medical space. HIPAA is, in every case I've ever observed, not a meaningful one.
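To make the "where it is technical" point concrete, here is a minimal sketch, assuming an AWS account that's already operating under a signed BAA; the bucket name and KMS key alias are hypothetical placeholders, not anything from a real deployment. It shows the flavor of safeguard involved: encryption at rest and no public exposure on storage that holds PHI.

```python
# Minimal sketch (hypothetical bucket and key names): the kind of technical
# safeguard HIPAA's Security Rule asks for, applied in an AWS account
# operating under a signed BAA.
import boto3

s3 = boto3.client("s3")

BUCKET = "example-phi-bucket"  # hypothetical placeholder

# Encrypt everything at rest by default with a customer-managed KMS key.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {
                "ApplyServerSideEncryptionByDefault": {
                    "SSEAlgorithm": "aws:kms",
                    "KMSMasterKeyID": "alias/example-phi-key",  # hypothetical alias
                },
                "BucketKeyEnabled": True,
            }
        ]
    },
)

# PHI should never be world-readable; block public access outright.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)
```

The point isn't that this exact snippet makes anyone compliant; it's that the technical side tends to boil down to a handful of settings like these, while the bulk of the work is policies, training, and documented process.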
>"But HIPAA" has never, in my experience, been employed except by people who find the idea of doing the right thing inconvenient or inconveniently expensive. (It is virtually never that hard and its benefits are clear.)
Thank you for directly attacking my character without even addressing my actual argument.
I'm not arguing against HIPAA; I'm arguing against such regulations in spaces that don't require that kind of sensitivity. I think medical data absolutely requires the protections it has. But HIPAA has also had the unintended consequence of making current medical data less secure and stifling innovation in the space. Most doctors aren't even HIPAA compliant, sending patient medical records over email.
I would estimate that 40% of doctors today are not compliant with HIPAA, sending X-rays and similar patient information using email providers they haven't signed BAAs with.
>There are reasons for not modernizing tech stacks in the medical space. HIPAA is, in every case I've ever observed, not a meaningful one.
Then please enlighten us. Up until a few years ago (maybe even just a year) you couldn't use AWS to host medical data. Today you can't use Google Cloud to host medical data unless you are a large enough business to get in contact with one of their sales reps. Can you even sign a business associate agreement with DigitalOcean? So up until a year ago you could not even have a small healthcare startup hosted on the cloud. Please explain to me how this hasn't stifled medical software innovation.
If it isn't HIPAA it's some other outdated regulation.
> Thank you for directly attacking my character without even addressing my actual argument.
"But it's hard, for no actual reason I will define" is not a meaningful argument. So--when one hears hoofbeats, think horses, not zebras.
> I would estimate that 40% of doctors today are not compliant with HIPAA, sending X-rays and similar patient information using email providers they haven't signed BAAs with.
Probably true! But that's their own damned fault. Medical has Artesys and similar services; dental has Apteryx and similar. This problem is largely solved, except for practitioners' unwillingness to use the tools.
Those providers should be nailed to the wall; the wall should not be torn down for them.
> Up until a few years ago (maybe even just a year) you couldn't use AWS to host medical data.
AWS has been signing BAAs since at least...2013? I believe the first time I looked into it was 2014. But, regardless: if your innovation was so tremendously stifled by this, I'm not particularly sympathetic. I've been running my own services, and writing them too, for at least a decade, and you can do likewise, I promise. I am, however, saying that today it's very easy to do so, 'cause Amazon is all too happy to sign one.
Also, I haven't had to use GCP for HIPAA-covered entities (I found their BAA pretty easily, though), but even assuming you're correct, the idea that you have to, hiss, talk to somebody before they'll take some legal responsibility for the PHI you hold doesn't strike me as a particularly nasty requirement. I still find it odd that AWS will just let you sign right through with AWS Artifact.
Azure's all too happy to sign one, too. Not that I'd recommend it.
> It seems pretty clear to me that anyone suggesting that software bugs in applications with no risk of causing physical harm should carry criminal liability has no idea what they are talking about, or what damage such a law would cause.
So you're fine with financial losses, loss of privacy, and the material harm that goes along with both? Disregarding the impact of data breaches is just naive.
> Case in point: look at the quality of medical software today. Hospitals still use Windows XP and other completely insecure, outdated software, because absolutely nobody wants to deal with the nightmare that is HIPAA.
I wrote medical device software for more than a decade. HIPAA has nothing to do with it. Many systems run on outdated platforms because the cost of replacing them is deemed to outweigh the benefits. That determination is debatable on a case-by-case basis, but in practice we see a hell of a lot more damage caused by breaches at companies running on modern technology than we do from, e.g., hospital systems or LIMS.
Uh huh, I'm sure it's just that easy, right? I mean, I'm certain it's an even playing field even for someone like me who has the money to hire an attorney. Hell, why regulate these industries at all? We can just file civil suits, right? Even if you win, it costs less for them to settle than it does to change the way they do business and security.
We regulate the finance industry not because of a risk of physical harm, but because financial harm can be equally serious and civil suits do not act as a sufficient deterrent to bad behavior by the powerful. Why do you feel this sort of thing is different? I believe the only real difference is that this sort of thing is new, not well understood by most, and we just haven't caught up.
HIPAA only carries criminal penalties when someone knowingly discloses covered information, not when a software bug does it (until the bug is identified, at least). For the most part, HIPAA is enforced with civil penalties.
And your "nightmare" scenario of (civil) liability flowing from programming bugs already exists in the investment world, and it hasn't come apart at the seams. Google AXA Rosenberg. A coding error in their trading algorithm went undiscovered for two years. Negligent for sure, but that's not why the SEC went after them. The problem was that they didn't promptly disclose the error to investors and they didn't promptly correct it. Algorithmic trading firms should have mechanisms to catch errors, correct errors, and disclose those errors to investors. And after seeing AXA Rosenberg's $250 million fine and Rosenberg's lifetime ban from the industry, guess what they all implemented?
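As a rough illustration of what "mechanisms to catch errors" can mean in practice, here's a toy sketch. The threshold, the shape of the weight data, and the logging setup are all hypothetical, not anyone's actual compliance control.

```python
# Toy sketch of a pre-trade sanity check: catch a model output that violates
# a basic invariant, halt, and leave an auditable record for later disclosure.
# All names and thresholds here are hypothetical.
import datetime
import logging

logging.basicConfig(filename="model_anomalies.log", level=logging.WARNING)

MAX_SINGLE_POSITION = 0.05  # no single name should exceed 5% of the portfolio


def check_target_weights(weights: dict[str, float]) -> None:
    """Reject the model's target portfolio if it violates basic invariants."""
    total = sum(weights.values())
    if abs(total - 1.0) > 1e-6:
        _flag(f"weights sum to {total:.6f}, expected 1.0")
    for ticker, weight in weights.items():
        if weight < 0 or weight > MAX_SINGLE_POSITION:
            _flag(f"{ticker} target weight {weight:.4f} outside [0, {MAX_SINGLE_POSITION}]")


def _flag(message: str) -> None:
    # Record the anomaly with a timestamp so there is something to disclose,
    # then stop trading on this model output rather than quietly continuing.
    logging.warning("%s anomaly: %s", datetime.datetime.utcnow().isoformat(), message)
    raise RuntimeError(f"model output rejected: {message}")


# Example: rejected, because one position exceeds the 5% cap.
# check_target_weights({"AAA": 0.90, "BBB": 0.10})
```

Per the comment above, the disclosure piece is what tripped AXA Rosenberg up, so the log record matters at least as much as the hard stop.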
> Covered entities and specified individuals, as explained below, who "knowingly" obtain or disclose individually identifiable health information, in violation of the Administrative Simplification Regulations, face a fine of up to $50,000, as well as imprisonment up to 1 year.
>
> Offenses committed under false pretenses allow penalties to be increased to a $100,000 fine, with up to 5 years in prison.
>
> Finally, offenses committed with the intent to sell, transfer or use individually identifiable health information for commercial advantage, personal gain or malicious harm permit fines of $250,000 and imprisonment up to 10 years.
My company's lawyers disagree. I'll go with my company's lawyers' judgement over a group that exists solely to protect the interests of its member doctors.
Are these lawyers you have talked to and gotten meaningful, nuanced advice from, or are they lawyers your bosses have talked to and derived maximally avoidant policies from? I'm not saying you shouldn't have policies that fit your risk profile, but I ask because I have been in the former kind of conversation (and I have done a nontrivial amount of auditing and compliance work in this space) and have never come away with such an impression. At the same time, the level of perceived risk your bosses derive from those conversations can be entirely untethered from the level of risk that actually exists. (This space is full of people saying "oh, HIPAA means we can't do that" as shorthand for "I don't want to do that," after all.)
If you read the sibling comment where Spooky23 cites the HHS page on HIPAA, it might be worth ruminating on how that squares with your interpretation of why your company's lawyers lay out the training the way they do.
That they have a different company risk profile doesn't necessarily change the facts at hand. And, TBH, they don't have to tell you the truth if it helps achieve their immediate goals. (They can tell you you'd be personally and criminally liable. It might make you do what they want better. It might also not be true.) Or it may all be in good faith. But what you describe doesn't square with anything I've ever worked with, at multiple clients and employers.
They are wrong, generally speaking. Willful conduct is the standard for criminal liability. A developer who introduces a bug in good faith, or inherits one from a third party, is not in that situation.
My guess is that the draconian position is more about internal process: you have to identify and disclose breaches in a timely way, and if you don't, the company is at risk.
“Criminal Penalties. A person who knowingly obtains or discloses individually identifiable health information in violation of the Privacy Rule may face a criminal penalty of up to $50,000 and up to one-year imprisonment. The criminal penalties increase to $100,000 and up to five years imprisonment if the wrongful conduct involves false pretenses, and to $250,000 and up to 10 years imprisonment if the wrongful conduct involves the intent to sell, transfer, or use identifiable health information for commercial advantage, personal gain or malicious harm.”