
Clearly the countermove is to train an image generator on your face and flood the internet with fake data. Have your face appear tagged in weird remote places, with several made up names, on statues, in ancient manuscripts.

Pretend to be dead, pretend to be a medieval monk, a war hero from the 1940s, a fictional character. Poison those ratty algorithms with nonsense and hide within the noise.




The countermove is what D-ID (YC-funded, IIRC) said they would do and then changed their mind about: run a face recognition model in the loop to generate modified versions of a picture such that a human would still believe it's the same person, but face recognition systems think it's someone else. Then use those pictures on all of your internet socials.

Edit: according to this article they did develop that, but they don't offer the service to individuals: https://techcrunch.com/2018/09/05/d-id-launches-its-initial-...
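Roughly, the idea looks something like the sketch below: nudge the pixels so an embedding model maps the photo far away from where it mapped the original, while keeping the change small enough that a human still sees the same face. This is only a minimal illustration, not D-ID's actual method (which is proprietary); the untrained torchvision ResNet-18 is a stand-in for a real face-embedding network, and eps/steps/lr are arbitrary values chosen for the example.

    # Minimal sketch: push a photo's embedding away from the original
    # while keeping the pixel change small (PGD-style perturbation).
    # The embedding model here is a stand-in, NOT a real face recognizer.
    import torch
    import torch.nn.functional as F
    import torchvision.models as models
    import torchvision.transforms.functional as TF
    from PIL import Image

    model = models.resnet18(weights=None).eval()   # stand-in embedding model

    def embed(x):
        return model(x)                            # treat the logits as an embedding

    def cloak(img_path, eps=4/255, steps=40, lr=1/255):
        img = TF.to_tensor(Image.open(img_path).convert("RGB")).unsqueeze(0)
        target = embed(img).detach()               # embedding of the original photo
        delta = torch.zeros_like(img, requires_grad=True)
        for _ in range(steps):
            loss = F.cosine_similarity(embed(img + delta), target).mean()
            loss.backward()
            with torch.no_grad():
                delta -= lr * delta.grad.sign()    # step that *reduces* similarity
                delta.clamp_(-eps, eps)            # keep the change visually tiny
                delta.grad.zero_()
        return (img + delta).clamp(0, 1).squeeze(0)

    # out = cloak("me.jpg"); TF.to_pil_image(out).save("me_cloaked.jpg")

As the reply below notes, a perturbation tuned against one model has no guarantee of transferring to different or future recognition systems.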


Facial recognition algorithms are a set of different and developing models. There's no guarantee any system aimed at the weaknesses of an existing algorithm would work against different or future facial recognition systems.


I want someone to make the ugly shirt.


Or just give up on privacy in the sense you mean. Ultimately we’re all living in meatspace. I was happier when I just accepted it. I’ve been on both sides of the privacy fence — when I was in my 20s I cared deeply about not being known, as you say. But it was self destructive, and I’ve seen many people who end up as reclusive as I was.

It’s fine to have privacy. And it’s necessary. But obsessing about it too much can really get to you, in a way that’s sort of hard to articulate.

If someone knows the city you live in, is that really a big deal?

Now, there is a flip side to this. I recognize this is a privileged position to hold. Some people’s lives have been ruined by harassment once their physical location got out, and I am fortunate not to have to worry about that. It’s particularly easy for me to say this as a man, since women have a very different threat model.

But as with all things, it’s a balance. The cold truth is that technology will only accelerate the trends that first Facebook and now Clearview are exploiting.

Just be happy and live a nice life. I hid too long.


> But it was self destructive, and I’ve seen many people who end up as reclusive as I was.

I won't give up, but I no longer freak out about it.

I am reclusive, apparently; at least, some friends have said so. I'm certainly not a misanthrope; I like people, and in general I trust them. But I don't trust "social media", in the Faceache/Twitter/X sense, and I've never had an account with those services. I have a couple of online accounts, and HN is one of them.

I've had one or two online-dating accounts, which I regret. If I try dating again, I'll do it organically, by joining some local interest group.

Being a "recluse" doesn't mean being some kind of hillbilly, who lives in the back of beyond, and answers the door with a loaded shotgun; it just means that you aren't desperate for friends.


I’ve noticed this about myself as I’ve aged: starting at one extreme, overcompensating with a swing to the other extreme, and finally settling somewhere in the middle that I’m content with.

Beliefs about religion, politics, family, privacy, hobbies, nutrition, science, and probably a lot more I can’t think of.

The reactionary overcompensation phase, in hindsight, always seems very “warn everyone” focused, until that moment when it clicks that the answer is neither extreme, and that the middle is also okay.


Agreed, I’m in a similar boat. Really, it’s out of our control how privacy-invasive laws and tech progress. Yes, we can vote against them, and should, but they’ll find a Patriot Act moment to shove through something we previously voted against.

And at that point, to me, it’s better to practice stoicism: accept that you cannot control the things you cannot control, and thereby stop worrying about them.

Same with AI. Yes, learn the tools to stay ahead. But pushing to stop it is a fool’s errand. “Best” case we make it so only the corporate few are allowed to make sophisticated AI, “worst” case everyone with ability has their own model, and it pervades society.

Find the things you love, surround yourself with friends and family, and live life. They can’t realistically stop you from stepping away from a screen and enjoying a fireside chat with your mates.


In the UK the policing minister plans to 'integrate data from the police national database (PND), the Passport Office and other national databases to help police find a match with the “click of one button”' [0].

It becomes very difficult to escape mass surveillance via facial recognition if passport (and presumably soon driver's license) photos are used.

[0] https://www.theguardian.com/technology/2023/oct/06/mps-and-p...


Scary thought:

Let's say you do this. And facial data DBs continue to grow. And eventually they become so widespread that most people are in them. And facial recognition becomes so ubiquitous that it is necessary for everyday transactions. And you can't participate in large swaths of society because you are, as far as the technology is concerned, a statue, or dead.

Far fetched? It's happening with cell phones. Many things are not possible, or vastly more difficult, if you don't have a smart phone.


How about taking existing photos of other people and doing the same thing? Make everyone a statue.


Good! A 0.01 - 1% failure rate and people will become very pissed off at the system, making it politically untenable. Which for a small country (say two million people) means cloning and poisoning the internet for 200 - 20,000 people. Not impossible.


>A 0.01 - 1% failure rate and people will become very pissed off at the system, making it politically untenable

I'm quite sure I don't live in a society where something (even greatly) inconveniencing a mere 0.1% of the public makes it politically untenable, and I'm very nearly as sure that you don't either.


I barely have time to get the laundry done, and I'm supposed to do this?


In the long run, the only option is to privatize knowledge entirely.

Machines don't care what numbers they run; CPUs and users should be blind to the meaning of app code. .exes should be absolute black boxes by default.

When we have the techniques to privatize knowledge from machines, the desire for privatized faces and biometrics will rise.

The entire premise of biometric tracking is that mental locks (passwords, etc.) don't work. That isn't true.

Privatize knowledge and mental locks work fine.
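For what it's worth, the "mental lock" side of this is already mundane to build. A minimal sketch, assuming the third-party 'cryptography' package and illustrative (not recommended) scrypt parameters: a memorized passphrase is stretched by a slow KDF, and the derived key is the only thing protecting the data at rest.

    # Minimal sketch of a "mental lock": a memorized passphrase protecting data at rest.
    # Requires the third-party 'cryptography' package; scrypt parameters are illustrative.
    import os
    from hashlib import scrypt
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def lock(secret: bytes, passphrase: str) -> bytes:
        salt, nonce = os.urandom(16), os.urandom(12)
        key = scrypt(passphrase.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)
        return salt + nonce + AESGCM(key).encrypt(nonce, secret, None)

    def unlock(blob: bytes, passphrase: str) -> bytes:
        salt, nonce, ct = blob[:16], blob[16:28], blob[28:]
        key = scrypt(passphrase.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)
        return AESGCM(key).decrypt(nonce, ct, None)   # raises InvalidTag on a wrong passphrase

    # blob = lock(b"my notes", "correct horse battery staple")
    # assert unlock(blob, "correct horse battery staple") == b"my notes"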


The theory of what’s possible seems less important here than the practical reality of what the lowest common denominator is capable of, and what they will choose out of convenience.

The average person, or even the average developer, isn’t going to use homomorphic encryption or be on point with their opsec.


Not a bad idea, to be honest.


I like it. Fight fire with fire. Also, every legitimate photo should be AI-enhanced so that it is impossible for the creeps to tell real photos from generated ones.



