I'd be fine with this being developed in a civilian clinical setting.

I am not fine with the military developing this and then considering its use on service members.

Particularly troubling are the lines "since patients will often tell their clinicians what they think the clinician wants to hear rather than how they are truly feeling" and "on aggregating preconscious brain signals to determine what someone believes to be true."

They see an issue with a voluntary clinical process, and their answer is to remove the voluntary aspect of it. To me, it seems they have misdiagnosed a process failure and are responding by removing the patient from their own process of care.

If the intention is to use this on service members without their explicit request, this presents one of the slipperiest slopes I've ever seen.




"I am not fine with the military developing this and then considering it's use on service members."

This. The rights and protections you have in the military, as well as the military judicial system, are vastly different from those in the civilian world. I have very little confidence in even the civilian side (so many abuses and so much incompetence).


That's an extremely good point. It's easy for me to forget that DARPA is primarily in the business of war.

I hope there is a version of this that's used to help people, but I have changed my mind and now agree that it's inappropriate for a military to be developing a technology like this.


How can they train a model when their training participants may also have been telling clinicians what they wanted to hear rather than how they were truly feeling? In any event the ground truth is unknowable, so it is wrong to assert it with authority.



