Clearview AI: secretive company that might end privacy as we know it (2020) (nytimes.com)
85 points by XitN on Oct 7, 2023 | 67 comments



Clearly the countermove is to train an image generator on your face and flood the internet with fake data. Have your face appear tagged in weird remote places, with several made up names, on statues, in ancient manuscripts.

Pretend to be dead, pretend to be a medieval monk, a war hero from the 1940s, a fictional character. Poison those ratty algorithms with nonsense and hide within the noise.


The countermove is what D-ID (YC funded, IIRC) said they would do and then changed their mind: train a face recognition library to generate modified versions of a picture such that a human would still believe it's the same person, but different enough that face recognition systems think it's someone else. Then use those pictures in all of your internet socials.

Edit: according to this article they did develop that, but they don't offer the service to individuals: https://techcrunch.com/2018/09/05/d-id-launches-its-initial-...
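The technique described above is essentially an adversarial perturbation attack. A minimal sketch of the idea, using a toy random linear map as a stand-in for a real face-embedding network (the model, dimensions, decoy target, and all parameters here are illustrative assumptions, not D-ID's actual method): pixels are nudged toward a decoy identity's embedding with signed gradient steps, while clipping keeps every pixel within a small budget so the photo still looks like the same person to a human.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for a face-embedding network: a fixed random linear map.
# (A real attack would use the gradients of an actual CNN embedder.)
D_PIX, D_EMB = 64, 8
W = rng.normal(size=(D_EMB, D_PIX))

def embed(x):
    return W @ x

def adversarial_perturb(x0, target_emb, eps=0.05, steps=50, lr=0.01):
    """Iterative FGSM-style attack: nudge pixels toward a decoy identity's
    embedding while keeping each pixel within +/- eps of the original,
    so the image stays visually similar."""
    x = x0.copy()
    for _ in range(steps):
        grad = 2 * W.T @ (embed(x) - target_emb)  # d/dx ||Wx - t||^2
        x = x - lr * np.sign(grad)                # signed gradient step
        x = np.clip(x, x0 - eps, x0 + eps)        # stay close to original
    return x

face = rng.uniform(0, 1, D_PIX)    # "my" photo, flattened to a vector
decoy = rng.normal(size=D_EMB)     # embedding of some other identity
adv = adversarial_perturb(face, decoy)

orig_emb, adv_emb = embed(face), embed(adv)
print("max pixel change:", np.abs(adv - face).max())
print("embedding moved by:", np.linalg.norm(adv_emb - orig_emb))
```

The pixel changes are bounded by `eps` (imperceptible in a real image), yet the embedding shifts substantially, which is the property that confuses a matcher. As the reply below notes, a perturbation tuned against one embedding model carries no guarantee against a different or future one.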


Facial recognition algorithms are a set of different and developing models. There's no guarantee any system aimed at the weaknesses of an existing algorithm would work against different or future facial recognition systems.


I want someone to make the ugly shirt.


Or just give up on privacy in the sense you mean. Ultimately we’re all living in meatspace. I was happier when I just accepted it. I’ve been on both sides of the privacy fence — when I was in my 20s I cared deeply about not being known, as you say. But it was self destructive, and I’ve seen many people who end up as reclusive as I was.

It’s fine to have privacy. And it’s necessary. But obsessing about it too much can really get to you, in a way that’s sort of hard to articulate.

If someone knows the city you live in, is that really a big deal?

Now, there is the flip side of this. I recognize this is a privileged position to have. Some people’s lives have been ruined due to harassment of their physical location, and I am fortunate not to have to worry. It’s particularly easy for me to say this as a man, since women have a very different threat model.

But as with all things, it’s a balance. The cold truth is that technology will only accelerate the trends that first Facebook and now Clearview are exploiting.

Just be happy and live a nice life. I hid too long.


> But it was self destructive, and I’ve seen many people who end up as reclusive as I was.

I won't give up; but I no longer freak out about it.

I am reclusive, apparently; at least, some friends have said so. I'm certainly not a misanthrope; I like people, and in general I trust them. But I don't trust "social media", in the Faceache/Twitter/X sense, and I've never had an account with those services. I have a couple of online accounts, and HN is one of them.

I've had one or two online-dating accounts, which I regret. If I try dating again, I'll do it organically, by joining some local interest group.

Being a "recluse" doesn't mean being some kind of hillbilly, who lives in the back of beyond, and answers the door with a loaded shotgun; it just means that you aren't desperate for friends.


I’ve noticed this about myself as I’ve aged, starting at one extreme, overcompensating reaction to the other extreme, and finally settling somewhere in the middle that I’m content with.

Beliefs about religion, politics, family, privacy, hobbies, nutrition, science, and probably a lot more I can’t think of.

The reactionary overcompensation phase, in hindsight, does always seem very “warn everyone” focused, before finally having that moment where it clicks that it’s neither extreme, and that the middle is also okay.


Agreed, I’m in a similar boat. Really it’s out of our control how privacy-invasive laws and tech progress. Yes we can vote against them, and should, but they’ll find a Patriot Act-style vehicle to shove in something we previously voted against.

And at that point, to me, better to practice stoicism and accept you cannot control things you cannot control, and thereby not worry about those things.

Same with AI. Yes, learn the tools to stay ahead. But pushing to stop it is a fool’s errand. “Best” case we make it so only the corporate few are allowed to make sophisticated AI, “worst” case everyone with ability has their own model, and it pervades society.

Find the things you love, surround yourself with friends and family, and live life. They can’t realistically stop you from stepping away from a screen and enjoying a fire side chat with your mates.


In the UK the policing minister plans to 'integrate data from the police national database (PND), the Passport Office and other national databases to help police find a match with the “click of one button”' [0].

It becomes very difficult to escape mass surveillance via facial recognition if passport (and presumably soon drivers license) photos are used.

[0] https://www.theguardian.com/technology/2023/oct/06/mps-and-p...


Scary thought:

Let's say you do this. And facial data DBs continue to grow. And eventually they become so widespread that most people are in them. And facial recognition becomes so ubiquitous that it is necessary for everyday transactions. And you can't participate in large swaths of society because you are, as far as the technology is concerned, a statue, or dead.

Far fetched? It's happening with cell phones. Many things are not possible, or vastly more difficult, if you don't have a smart phone.


How about take existing photos of others and do the same thing. Make everyone a statue.


Good! A 0.01-1% failure rate and people will become very pissed off at the system, making it politically untenable. Which for a small country means cloning and poisoning the internet for 200-20,000 people. Not impossible.
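A back-of-the-envelope check on those figures, assuming a small country of roughly 2 million people (the population is my assumption; the comment gives only the failure rates and the resulting headcounts):

```python
# 0.01% and 1% failure rates applied to a small country's population.
population = 2_000_000

for failure_rate in (0.0001, 0.01):
    affected = round(population * failure_rate)
    print(f"{failure_rate:.2%} failure rate -> {affected:,} people")
# 0.01% -> 200 people, 1.00% -> 20,000 people
```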


>A 0.01 - 1% failure rate and people will become very pissed off at the system, making it politically untenable

I'm quite sure I don't live in a society where something (even greatly) inconveniencing a mere 0.1% of the public makes it politically untenable, and I'm very nearly as sure that you don't either.


I barely have time to get the laundry done, and I'm supposed to do this?


In the long run, the only option is to privatize knowledge entirely.

Machines don't care what numbers they run; CPUs and users should be blind to the meaning of app code. Executables should be absolute black boxes by default.

When we have the techniques to privatize knowledge from machines, the desire for privatized faces and biometrics, will rise.

The entire premise of biometric tracking is that mental locks (passwords, etc.) don't work. It's not true.

Privatize knowledge and mental locks work fine.


The theory of what’s possible seems less important here than the practical reality of what the lowest common denominator is capable of, and what they will choose out of convenience.

The average person, or even the average developer, isn’t going to use homomorphic encryption or be on point with their opsec.


Not a bad idea, to be honest.


I like it. Fight fire with fire. Also every legitimate photo should be AI enhanced so that it is impossible for the creeps to tell real photos from generated ones.



Shot:

> At my request, a number of police officers had run my photo through the Clearview app. They soon received phone calls from company representatives asking if they were talking to the media.

Chaser:

> Clearview tries to pre-empt concerns with an F.A.Q. document given to would-be clients that says its customer-support employees won’t look at the photos that the police upload.


This is a bad company with a bad purpose for existing, and the people working with / for them should feel bad!


"customer-support employees" - nice dodge.


If only there were a body of democratically elected leaders who could help enact policies and laws for this sort of thing, in the interest of the majority of the citizen class.


If only that body didn't delegate all of their work to unelected officials and basically sit there and collect a paycheck.....


There’s so much emphasis on government use of these tools, but I’m a lot more wary of a world where consumers have easy access to facial recognition.

Pimeyes is a free and shockingly effective way to figure out who someone in a photo is. High schoolers have already adopted it - poke around on TikTok and you’ll see them uploading images to figure out who that athlete from another school or random stranger on the street is. It’s already here.


1. it's from 2022

2. https://archive.ph/wmOpD


> Published Jan 18 2020, Updated Nov 2 2021


You're right, it was archived in 2022


If this site is going to rely on archiving sites to allow people to read articles, perhaps you should just post the archive link to begin with.


The archive website didn't write that article, nor does it fund it. The paper's business model might be ridiculous, but that's how it pays its reporters.


And? We're here discussing an article most can't access otherwise. Why give them this?

Find another source.


A relevant article that comments on this one: https://www.sfchronicle.com/business/article/The-person-behi...


This recent article from 404 Media details how a few companies are already making it possible to doxx individuals based on single images posted on social media using facial recognition AI.

I experimented with two of the free sites alluded to in the article (PimEyes and facecheck.id), using a photo of my face that is not online. Both sites turned up a lot of images of me, including some that were 10+ years old and a few I had never seen before, collectively making it possible to identify my name, employer, city of residence, partial career history, etc. I'd say the problem is not just Clearview.

https://www.404media.co/taylor-swift-facial-recognition-tikt...


I'm convinced any tool like this can only be fought by turning the tables, by using them against the people who abuse them, the owners/executives of the companies and legislators who allow the abuse.


I am starting to believe that this is, most likely, the best way out of a lot of the mess we find ourselves in, in the modern digital world. We, the laypeople, are under scrutiny in so many ways, our lives being recorded all the time. The basic asymmetry is that, on the other hand, the powerful have tools at their disposal to hide from scrutiny, and at the same time have access to the data collected on us. With well designed systems, we could remove this unfair advantage that they have, and level the playing field.


Would it be immoral to mount a community effort to track and publicize the movements of the executives and enablers of such enterprises? Similar to ElonJet.


>One reason that Clearview is catching on is that its service is unique. That’s because Facebook and other social media sites prohibit people from scraping users’ images — Clearview is violating the sites’ terms of service.

“A lot of people are doing it,” Mr. Ton-That shrugged. “Facebook knows.”

So: I own the copyright to the image I uploaded to Facebook, I licensed it to Facebook, and they stole it from Facebook. Did they infringe my copyright?


In Europe: no. Training ML systems doesn't require a license.

In the US: we don't know yet. The law is not clear, and there are lots of suits in progress.


You misunderstand. They copied the literal image from facebooks computers to their own to train with.


And that may be permitted under fair use. It is not yet settled law in the US.


Under what theory would they be granted fair use for commercial use of images illegally acquired from a third party and ultimately jacked from people who they have no business relationship with?


LinkedIn lost a lawsuit over that subject, which effectively nullifies those terms.


LinkedIn lost a case questionably decided by the single most reversed circuit, which is binding only in that circuit, which is liable to be decided differently in another, and which will provide impetus for the Supreme Court to take it up. That decision is in fact in the middle of being ripped up. It is the opposite of decided law.

https://www.socialmediatoday.com/news/LinkedIn-Wins-Latest-C...

It's going to be ultimately decided that violating the TOS to scrape after being told not to is legally questionable, and that such questionable acquisition further undermines any already ridiculous claim to fair use.


> Did they infringe my copyright?

Yes. How are you going to stop them?


Seems like the logical thing would be for some attorney to do the leg work for a class action lawsuit and ask for essentially all of their money given the number of acts of infringement.


Related:

Clearview AI helps law enforcement match photos of people to their online images - https://news.ycombinator.com/item?id=22083775 - Jan 2020 (156 comments)


I once got into a shouting match with one of the founders of this company. In the middle of the argument, he said to me, quote, “you civil libertarian/liberty types are going to lose."

During the same conversation, he also implied that people who cared about liberty and privacy were dinosaurs, and that companies like his were the Chicxulub impactor.

I hope that in a few years, I can confidently say that he's wrong.


He's not wrong.

It doesn't matter, when liberty goes, so will the lifeblood of technological change.

The company founder types are building their own mausoleum.

When this is all said and done, The Matrix, Star Trek, etc. will only be populated by those guys. The rest of us will be happy and free from it, with real resources and real lives.


It's truly scary how zealous some people can become and how they double down in the face of criticism.

A little introspection and contemplation is always good.


The right to control our public exposure is just one facet of the right to privacy.

The right to privacy is personal sovereignty. Meaning control over oneself.

I am my data, my data is me. I own all data about me.

Recognizing our collective right to privacy prohibits facial recognition used for mass surveillance.

Firstly, each person must consent to being represented in those datasets.

Secondly, each person must consent to being tracked.

Have you given consent? I certainly have not. I wasn't even asked.


Didn't Google image search use to do this, until they chose to disable it?

Yandex remains somewhat usable for that purpose.


> goes far beyond anything ever constructed by the United States government

I believe that is not true


Source?


According to the article, the database consisted of about 3 billion faces in 2020, when the article was published. If you go on their web page, it says it now consists of 30 billion. Quite an increase.


Pretty sure that nearly all of their business practices are GDPR violations. So we do have a choice: we could either end privacy, or have legislation that ends a potentially profitable startup.


"Won't someone think of the employees" is a common tactic companies use to avoid shutdowns, even when what they're doing is against long-standing laws and regulations. Seems to work often enough, too.


I care about my privacy. Why should I care about the profitability of a startup that violates it?


The answer is pretty simple: Privacy matters more.

There are millions of ways to make a profit, but you only have one privacy.


Privacy doesn't matter and GDPR is not a solution.


Your privacy may not matter to you. For most of us here, though, our privacy matters to us. So "privacy doesn't matter" seems like a rather incorrect generalization.


Regardless of what we preach, the fact of the matter is that privacy has been eroded to hell and back. The governments that eroded it were (mostly) duly elected by the people. If this is what representation of the people looks like, then, well, privacy doesn't actually matter to anybody but a fringe minority. Welcome to democracy, etc.


Do you live in a glass house?


Logic clearly dictates that the needs of the many outweigh the needs of the few.


Indeed. But the many what?

From what I've seen in this world, the needs of the many-monied individuals outweigh the needs of the few-monied masses.


So find at least one EU citizen in their database and unleash the GDPR against them?


How would an EU law apply to a U.S. company if it has no assets or operations in an EU jurisdiction?


It applies if they work with PII of EU citizens. It might be too difficult to enforce the rules, but that is a different matter.



