90d's comments

I'm never eating the JUST EGG


It tasted fine.

There are recipes for egg equivalents online, typically involving kala namak (aka black salt), if you want to try making it yourself.


Ah just what I needed, more anxiety.


And if you put it into the market today they can help you lose the rest. Better late than never!


"Free". The worst four-letter F word in America.


A big problem with social media ad targeting recommendations: they are completely unweighted.

What you think you are targeting: 'business owners researching startups and investing'.

And somehow your ads get pushed to 'people with interest in dogs' when you go with the recommendations.


I am in Bangkok right now, and while there is public transportation at every corner, the air quality is still really poor. I took this post to mean the rich are falling out of love with LUXURY cars. I got passed by a Lamborghini today while walking the streets and said to my friend, while everyone else was busy trying to snap pictures, that I would only be impressed if the driver could do his own oil change.

I expect $tatus $ymbols to continue falling out of favor and be replaced by self-sufficiency. The rich are also falling out of love with alcohol and falling in love with fitness. This is good for everyone.


Also, I will never give up my ICE vehicle back home, a 20-year-old diesel with 270,000 miles on the clock. I am an original hacker/tinkerer and absolutely despise the tech and guardrails included in ALL modern cars, but ESPECIALLY the EVs.


Food for thought: someone who struggles to fit a gym routine into their life is not someone who knows how to train their muscles efficiently. Thanks for the article, Australia.


The road to hell [...] good intentions.


I see other people in the comments saying that this might be used to limit people's rights. I don't view that as a good reason to not pursue this research.

We have existing legal frameworks for ensuring people's rights are protected. Doctors can't just kidnap someone; courts are involved. In many cases those rights and protections should be strengthened.

The fact that those protections and frameworks aren't perfect shouldn't stop doctors and medical researchers from doing their job, which is to treat patients.


I'd be fine with this being developed in a civilian clinical setting.

I am not fine with the military developing this and then considering its use on service members.

Particularly troubling are the lines "since patients will often tell their clinicians what they think the clinician wants to hear rather than how they are truly feeling" and "on aggregating preconscious brain signals to determine what someone believes to be true."

They see an issue with a voluntary clinical process and want to remove the voluntary aspect of it. To me, it seems they have misdiagnosed a process failure and are attempting to remove the patient from their own process of care.

If the intention is to use this on service members without their explicit request, this presents one of the slipperiest slopes I've ever seen.


"I am not fine with the military developing this and then considering it's use on service members."

This. The rights and protections you have in the military, as well as the military judicial system, are vastly different from the civilian world. I have very little confidence in even the civilian side (so many abuses and so much incompetence).


That's an extremely good point. It's easy for me to forget that DARPA is primarily in the business of war.

I hope there is a version of this that's used to help people, but I have changed my mind and now agree that it's inappropriate for a military to be developing a technology like this.


How can they train a model when their training participants were also telling clinicians what they wanted to hear rather than how they were truly feeling? In any event the truth is unknowable, so it is wrong to assert it with authority.


It doesn't matter how many legal frameworks there are. Neither courts nor doctors can read people's minds, so they are basically kidnapping people based on something that is barely more than a guess. There are thousands of stories of people getting sectioned for something stupid like having a dark sense of humour and telling an off-colour joke. There are also thousands of stories of people who had been plotting their suicide for months, reached out to the hospital in one last attempt to get help, got turned away for supposed attention-seeking, and then killed themselves.

Anything that can elevate institutionalization above mass guessing has to be a plus. Though we also need to solve the problem that these institutions are so often nightmares to be in, so that suicidal people get what they need instead of just being imprisoned.


There is no framework whatsoever for neural monitoring. That claim is willfully dishonest, and more than a bit worrying to see trotted out as a defense of one of the most invasive technologies currently in development.


That's not true at all, at least in a civilian context. As others have pointed out, the fact that this is being developed by DARPA means that those legal restraints may not actually apply, which I agree is very disturbing.

First of all, it would be health data protected under HIPAA.

It could also be relevant to involuntary hospitalization, where the current standard is "clear and present danger." In general, you can't be involuntarily hospitalized for saying something like "I'm having thoughts of suicide," but you could be involuntarily hospitalized for talking about a specific plan for suicide or actually attempting suicide. The idea that this technology could legally demonstrate that someone is a clear and present danger to themselves is far-fetched. I'm not saying the legal system is perfect or even good, but it's not 100% stupid. Judges can and do distinguish between statistical and non-statistical evidence.

Red flag laws/ERPOs use a less stringent standard from what I understand, so it is somewhat more likely (although still unlikely overall, I'd argue) to be applicable in that case.


In Sweden they can certainly do so. If a person is a danger to themselves or others (as declared by two doctors), they can be put into closed mental care with police escort and drugged to the extent that they can't speak for themselves, without any right to legal protection. The protection of their life is considered more important. I don't know about US law, but other countries probably have similar laws.


Make AI. Choose next path:

(1) herculean effort to make it conform to what they have established as politically correct.

(2) improve AI.


Everyone wants "politically correct" AI.

Wanna keep it from turning into a murderbot or paperclip maximizer? Welcome to the arena of the politics of correct values. Wanna use it for longevity research? Same.

Some people also would like to make sure it isn't helping stoke antisemitism or genocide, but apparently that's controversial for some reason, so guess it's political.

(And that's before we get to the usual issue: people who use "politically correct" as a drive-by aspersion, as is so unfortunately common, are usually more ideological than those they criticize, and very much have their own political correctness they're anxious to impose by any means available to them...)


(1) herculean effort to make it treat everyone with kindness and respect.

(2) improve AI.

(3) work out how to do both at the same time.

FTFY


The majority of us will be long dead before then, even more so if your random sci-fi pill were released into the wild (it never would be).

