Taser has started its own in-house AI unit (vocativ.com)
33 points by mattlevan on Feb 17, 2017 | 26 comments



I work at TASER/Axon, the company the article claims has this technology. This article, which appeared on the front page of the Drudge Report this morning, is extremely disingenuous.

They have repackaged a press release announcing our investments in computer vision and machine learning, and made the huge leap that we have built, or are currently building, facial recognition into our cameras. This is false.

We believe that AI has the potential to streamline police work and reduce inefficiency. Police officers spend a substantial amount of time (65% according to a Netherlands study) doing administrative tasks like report writing and filling out forms. Our goal is to reduce this wasted time so that officers can spend more time in the street and with their communities.

If this sounds interesting, we're hiring! Come join our team in Seattle and write code that saves lives.


Thanks for sharing your perspective. Reducing inefficiency would be great, but what is TASER/Axon's view of this technology's effects on privacy and civil rights (EDIT: and of its responsibility to create products that protect rights and to sell them to customers who will do the same)?


What is the technology/stack/infrastructure used there?


The title of the article is misleading. The article actually states that Taser's cameras do not have facial recognition. They use face detection to blur and redact faces from footage captured by body cameras. And far from real time, they say they hope to reduce that redaction process from eight hours to 1.5 hours for an hour-long video clip.
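
For anyone curious what that redaction pipeline might look like, here's a minimal sketch using OpenCV's stock Haar-cascade face detector plus a Gaussian blur. The file names are made up and this is purely illustrative; it's not what Taser/Axon actually runs:

  # Illustrative face redaction: detect faces in each frame, then blur them.
  # A production system would use a much stronger detector plus tracking.
  import cv2

  cascade = cv2.CascadeClassifier(
      cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

  cap = cv2.VideoCapture("bodycam.mp4")   # hypothetical input clip
  writer = None
  while True:
      ok, frame = cap.read()
      if not ok:
          break
      gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
      faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
      for (x, y, w, h) in faces:
          frame[y:y+h, x:x+w] = cv2.GaussianBlur(frame[y:y+h, x:x+w], (51, 51), 0)
      if writer is None:
          height, width = frame.shape[:2]
          writer = cv2.VideoWriter("redacted.mp4",
                                   cv2.VideoWriter_fourcc(*"mp4v"), 30, (width, height))
      writer.write(frame)
  cap.release()
  if writer is not None:
      writer.release()

Note that nothing here identifies who a face belongs to; it only finds face-shaped regions and blurs them, which is exactly the detection-vs-recognition distinction other commenters are drawing.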


14 years ago I told my teacher at a very respected uni that real-time facial recognition would surely be used by gov for security.

He said I was insane and that this would be extremely hard to implement and it was very improbable that it would ever happen. He basically demoralized me in front of my colleagues.

Needless to say, he failed to convince me otherwise, and I lost all respect for him.


One rule of thumb I've found very useful: if someone says a proposed or imagined outcome is impossible rather than asking how it would work, the dismissal is probably bullshit.

Obviously if they can follow through with a good reason like pointing out a violation of thermodynamics or some inherent mathematical paradox you should take that seriously, but the more reflexive the dismissal the less well-founded it usually turns out to be.


Per the comments noting that this (police-mounted face recognition) isn't a thing:

I'm amazed that pervasive face recognition isn't a thing. I figured it would be the killer app for Google Glass, popping up notes about everyone you see. Apple & Facebook could leverage it from the face tagging in Photos & FB. Police could be well-served by a vest-mounted lightweight system giving likely identifications of encountered suspects (or anyone) from mugshots et al. Thing is, all the technology is there waiting for use; it's a killer app just waiting for society to put up with it (just like privacy invasions that were intolerable back when and are normal today).

(I'm not necessarily advocating it, just figuring it's inevitable and ready to explode onto the scene.)


It exists in Russia under the name FindFace.


The only viable solution to this is regulation, and our only real hopes for it are the ACLU and the EFF.


It's too easy to implement, so it can't be regulated. The only thing you get in return is that you'll be able to recognize the faces of police officers as well with the same technology. The world is shrinking, and it will feel like we're all living in a huge village.


> and it will feel like we're all living in a huge village

A very apt way of putting it. For a while now I've held the opinion that anonymity (and to some degree privacy, at least the privacy you expect from anonymity) as we know it is a short-lived artifact of our aggregation into larger groups, where it's not possible to know everyone around you and almost everything about them. The farther back you go, the less anonymity there was from the people you saw every day.

In that respect, what we're seeing is a return to the norm. That doesn't mean it's better, but I'm not sure there's a way to actually prevent it except on an individual basis, and then only with a lot of work on the part of that individual.


> In that respect, what we're seeing is a return to the norm

In no way. Note that the article is about police body cameras. This is less about technology than it is about the power of the state over individuals.


I'm not sure what you think gives the state a monopoly on this. There's massive amounts of information about most people online, and there will be even more in the future. I think it's likely that in the future even if you've never put a single picture of yourself online, there will likely be enough images of people in public and associated information to identify you anyway (use a credit card at a store? Why wouldn't they choose to associate your image from in-store cameras with that customer record?).

All the more egregious aspects of online marketing and tracking are coming to the physical realm. You too will likely be able to access this information, for a price. Good luck trying to stop it. Honestly, the world would be a better place in my estimation if you could; I just have little faith in measures designed to combat economically incentivized behavior.


> It's too easy to implement, so it can't be regulated

A backdoor for the state to have mic and camera access in all iPhones is very easy to implement.

But people don't want it, so it's not (currently) there.

Clipper chips have been possible for over 20 years, but as a society, we decided that we didn't want them. What people want is more important than what's possible.


It can only be implemented if you have a database of faces. You can probably get that from mug shots, but false positives may be challenging without a database of everyone else.

And I think it's still quite hard to do for video in real time, though.
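
A rough sketch of why the gallery and the match threshold matter. The embeddings and names below are placeholders rather than real data, but the trade-off is real: loosen the threshold and you catch more true matches at the cost of more false positives against people who simply aren't in the database:

  import numpy as np

  # Pretend gallery: one 128-d embedding per known person (e.g. from mug shots).
  gallery = {
      "person_a": np.random.rand(128),   # placeholder embedding
      "person_b": np.random.rand(128),   # placeholder embedding
  }

  def identify(probe, threshold=0.6):
      """Return the closest gallery identity, or None if nothing is close enough."""
      best_name, best_dist = None, float("inf")
      for name, emb in gallery.items():
          dist = np.linalg.norm(probe - emb)
          if dist < best_dist:
              best_name, best_dist = name, dist
      # A larger threshold yields more hits, but also more misidentifications
      # of faces that don't belong to anyone in the gallery.
      return best_name if best_dist < threshold else None

  print(identify(np.random.rand(128)))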


> The only thing you get in return is that you'll be able to recognize the faces of police officers as well with the same technology.

Sousveillance at its best.


You can't regulate technology out of existence; the harder you try the greater the potential rewards for using it asymmetrically. I can think of legal objections, but I can also think of equally good arguments that circumvent those objections. Rather than wishing a new technology away, which has never ever worked, it's better to assume it will be the norm and then try to imagine what countermeasures would be deployed in response.


> You can't regulate technology out of existence

No, but you can regulate its use. This is why police need a warrant to pick the lock on your front door, or to use a thermal imaging camera on your home.


Or why police need to announce that they smell weed before bashing in your front door to check. Or claim that they heard someone screaming for help before bashing in your front door. Or claim that they heard gunfire so they can bash in your front door. My point being that police don't always need a warrant to enter a place, just like they don't need warrants to MITM cell towers.


Or wear make-up every time you leave your home.

https://cvdazzle.com/


Facial Recognition - This is who the person is, based on this face.

Facial Detection - This part of the image is a face.

The article is referring to facial detection and object classification and not, as mentioned, facial recognition. Also, it would certainly NOT be real time. Computers are still quite poor at this, even given a long time and plenty of processing power.
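
To make that distinction concrete, here's a rough sketch using the open-source face_recognition library; the file names are hypothetical, and this has nothing to do with any particular vendor's body-camera software:

  import face_recognition

  img = face_recognition.load_image_file("frame.jpg")        # hypothetical frame

  # Detection: "this part of the image is a face" -> bounding boxes only.
  boxes = face_recognition.face_locations(img)                # [(top, right, bottom, left), ...]

  # Recognition: "this face is person X" -> compare against a known face.
  mugshot = face_recognition.load_image_file("mugshot.jpg")   # hypothetical reference image
  known = face_recognition.face_encodings(mugshot)[0]
  for enc in face_recognition.face_encodings(img, known_face_locations=boxes):
      match = face_recognition.compare_faces([known], enc, tolerance=0.6)
      print("same person" if match[0] else "unknown face")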


> The article is referring to facial detection and object classification and not, as mentioned, facial recognition. Also, it would certainly NOT be real time. Computers are still quite poor at this, even given a long time and plenty of processing power.

If you visit the links presented in the article, you'll find this is possibly not correct. The article may have misrepresented things, but it does lead to a study that specifically mentions "facial recognition" capabilities on these devices.

"A Market Survey on Body Worn Camera Technologies" from Johns Hopkins University Applied Physics Laboratory and sponsored by the Department of Justice’s National Institute of Justice in November of 2016 found 9 different devices capable of facial recognition[1]. The software itself comes from a handful of startups and technology companies.

They define facial recognition: "Facial recognition features allow the user to identify or verify a person from a digital image or a video frame."

Relevant to Hacker News, this article is probably timely because of the mentioned acquisition of Dextro by Axon[2] (owned by TASER), which plans to use Dextro's software in its body cameras[3].

[1] - https://www.ncjrs.gov/pdffiles1/nij/grants/250381.pdf

[2] - https://medium.com/@dextro_co/dextro-announcement-d21212463b...

[3] - https://buy.taser.com/axon-body-camera/


Ok, we replaced the submitted title ("Police Body Cams Have Real-Time Facial Recognition") with the first part of the subtitle. If someone can suggest a better (more accurate and neutral) title for the article, we can change it again.

The submitter was correct not to use the article's title, because that one was too baity. But it's usually much better to pick alternative language from the article itself—such as a title or representative sentence—than to make up a new title oneself.


And who would have thought back in 2007 that Facebook would be able to identify people in photos with relative accuracy in close to real time? It's possible, and there are large social incentives to do it because it has the potential to benefit people in many contexts. Those are sufficient conditions for any given technology to become near ubiquitous.


Guy Fawkes. Guy Fawkes. Guy Fawkes.


So the system recognizes the mask as a known mask and recommends immediate application of laws punishing the wearing of masks in public for nefarious purposes, which is rather the whole point of wearing that particular mask.



