
I see other people in the comments saying that this might be used to limit people's rights. I don't view that as a good reason to not pursue this research.

We have existing legal frameworks for ensuring people's rights are protected. Doctors can't just kidnap someone; courts are involved. In many cases those rights and protections should be strengthened.

The fact that those protections and frameworks aren't perfect shouldn't stop doctors and medical researchers from doing their job, which is to treat patients.




I'd be fine with this being developed in a civilian clinical setting.

I am not fine with the military developing this and then considering its use on service members.

Particularly troubling are the lines "since patients will often tell their clinicians what they think the clinician wants to hear rather than how they are truly feeling" and "on aggregating preconscious brain signals to determine what someone believes to be true."

They see an issue with a voluntary clinical process and they want to remove the voluntary aspect of it. To me, it seems they are responding to a process failure they haven't correctly diagnosed, and their fix is to remove the patient from their own process of care.

If the intention is to use this on service members without their explicit request, this presents one of the slipperiest slopes I've ever seen.


"I am not fine with the military developing this and then considering it's use on service members."

This. The rights and protections you have in the military, as well as the military judicial system, are vastly different from the civilian world. I have very little confidence in even the civilian side (so many abuses and so much incompetence).


That's an extremely good point. It's easy for me to forget that DARPA is primarily in the business of war.

I hope there is a version of this that's used to help people, but I have changed my mind and now agree that it's inappropriate for a military to be developing a technology like this.


How can they train a model when their training participants may also have been telling clinicians what they thought the clinician wanted to hear rather than how they were truly feeling? In any event, the ground truth is unknowable, so it is wrong to assert it with authority.


It doesn't matter how many legal frameworks there are. Neither courts nor doctors can read people's minds so they are basically kidnapping people based on something that is barely more than a guess. There are thousands of stories of people getting sectioned for something stupid like having a dark humour and telling an off-colour joke. There are also thousands of stories of people who had been plotting their suicide for months, reached out to the hospital for one last attempt for help, got turned away for supposed attention-seeking, and then killed themselves.

Anything that can elevate institutionalization to more than mass guessing has to be a plus. Though we also do need to solve the problem that these institutions are so often nightmares to be in, so that suicidal people are getting what they need instead of just being imprisoned.


There is no framework whatsoever for neural monitoring. Claiming otherwise is willfully dishonest, and more than a bit worrying to see trotted out as a defense of one of the most invasive technologies currently in development.


That's not true at all, at least in a civilian context. As others have pointed out, the fact that this is being developed by DARPA means that those legal restraints may not actually apply, which I agree is very disturbing.

First of all, it would be health data protected under HIPAA.

It could also be relevant to involuntary hospitalization, where the current standard is "clear and present danger." In general, you can't be involuntarily hospitalized for saying something like "I'm having thoughts of suicide," but could be involuntarily hospitalized for talking about a specific plan for suicide or actually attempting suicide. The idea that this technology could legally demonstrate that someone is a clear and present danger to themselves is far fetched. I'm not saying the legal system is perfect or even good, but it's not 100% stupid. Judges can and do distinguish between statistical and non-statistical evidence.

Red flag laws/ERPOs use a less stringent standard from what I understand, so it is somewhat more likely (although still unlikely overall, I'd argue) to be applicable in that case.


In Sweden they certainly can: if a person is a danger to themselves or others (as declared by two doctors), they can be put into closed mental care under police escort and drugged to the extent that they can't speak for themselves, with no right to legal protection. The protection of their life is considered more important. I don't know about US law, but other countries probably have similar laws.



