Reminds me of this RadioLab episode about a company that loiters over areas with a high resolution imaging platform that allows them to 'rewind' from the scene of a crime to see where the suspects came from:
If you're looking for more info, there is a whole book that goes beyond the content of this episode. I found it an informative and eye-opening (albeit disconcerting) read.
All of these are good illustrations of the idea that by putting multiple data sources together, even "anonymous" data sets can be de-anonymized surprisingly easily.
I'm working on a project to study Reddit data to see how many users have unique footprints. I'm only looking at the subreddits where people participate. If you participate in at least five subs, you're probably the only person with that unique combination of subreddits.
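The core of that uniqueness check fits in a few lines. A minimal sketch (all usernames and subreddits below are made up): count how many users share each exact combination of subs; a count of 1 means a unique footprint.

```python
from collections import Counter

# Hypothetical user -> subreddit participation data (names are made up).
users = {
    "alice": {"python", "cooking", "hiking", "chess", "jazz"},
    "bob":   {"python", "cooking", "hiking", "chess", "jazz"},
    "carol": {"python", "cooking", "hiking", "chess", "gardening"},
}

# How many users share each exact combination of subreddits?
fingerprints = Counter(frozenset(subs) for subs in users.values())

# A user is uniquely identifiable if nobody else has the same combination.
unique_users = [u for u, subs in users.items()
                if fingerprints[frozenset(subs)] == 1]
print(unique_users)  # → ['carol']; alice and bob share a footprint
```

With real data the interesting question is what fraction of users end up in the `unique_users` bucket as the number of subs per user grows.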
It was a cool show, but a lot of the domains I am familiar with (hacking/cybersecurity, programming, compression) were very much made up for the storyline, so I suspect they didn't have much accuracy in domains I don't know much about (superintelligent AI, government surveillance, etc).
I highly doubt AI is being used here to match up the Open Camera footage with the instagram pic. Identifying humans as objects within a frame is trivial. Identifying individuals requires a considerable amount of custom training, which is more than what is available from an instagram pic and a few video frames. I'm not saying it can't be done, it's just not that easy. I'd bet $100 that most, if not all, of these matches were accomplished using good old stalking (matching post times to manually reviewed video data). I'd be happy to be proven wrong, but without any source code provided I stand firm in my opinion.
Have a look at pimeyes.com. A single picture is enough, often even with sunglasses. Though that site requires input photos of slightly better quality than what you would be able to get from surveillance webcams. Then again, try it often enough, and use AI enhancement/upscale, and perhaps you can just use the site to replicate the project.
I suspect there's a fair amount of effort involved, but I don't think you'd have to work too hard to substantially narrow the search with machine learning. Even a small classifier that outputs clothing color and timestamps would make a remarkable difference. Detecting the humans visible in the videos and checking how long they linger could surface candidates for people getting their photo taken.
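That narrowing step can be sketched with something as crude as a coarse color histogram comparison. This is a toy stand-in, not the project's actual pipeline: a real system would use a proper person detector for the crops, and all the "crops" below are synthetic solid-color patches.

```python
def color_histogram(pixels, bins=4):
    """Coarse RGB histogram: each channel quantized into `bins` buckets."""
    hist = [0] * (bins ** 3)
    for r, g, b in pixels:
        idx = ((r * bins // 256) * bins * bins
               + (g * bins // 256) * bins
               + (b * bins // 256))
        hist[idx] += 1
    total = sum(hist) or 1
    return [h / total for h in hist]

def similarity(h1, h2):
    """Histogram intersection: 1.0 means identical color distributions."""
    return sum(min(a, b) for a, b in zip(h1, h2))

# Synthetic "person crops": the subject wears red; bystanders wear blue/green.
subject = [(200, 30, 30)] * 50
crops = {
    "a": [(30, 30, 200)] * 50,   # blue jacket
    "b": [(210, 25, 35)] * 50,   # red jacket -> should rank first
    "c": [(30, 200, 30)] * 50,   # green jacket
}

target = color_histogram(subject)
scores = {name: similarity(target, color_histogram(c))
          for name, c in crops.items()}
best = max(scores, key=scores.get)
print(best)  # → b
```

Ranking candidate crops this way would not identify anyone, but it could cut the frames a human has to review from thousands to a handful.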
> OMG that's the most terrifying thing I have seen in ages.
Too late.
NTechLab scraped vk.com (a Facebook-like Russian social network) for profile photos and ran a public website, "FindFace", that let anyone find profiles from a photo of a face. They provide services to the Russian government; e.g. it's used to find people from protests. FindFace was also used to find and harass webcam models.
> Intel, SpaceX, Philip Morris, and dozens of other US companies were in a leaked database of users for a Russian facial recognition company
> The NTech Lab user list, which was shared with Insider by an anonymous source, includes more than 1,100 entries, with businesses and government agencies from more than 60 countries.
> In 2017, NtechLab face recognition algorithms were built into the Moscow city video surveillance system operated by the Moscow Department of Information Technology. The system uses the database of the Ministry of Internal Affairs to find correspondences to it on video. The alleged use of the system is the search for criminals and the fight against terrorism.
> In 2022, NTechLab was accused of assisting the Russian and Belarus government by tracking thousands of political activists which led to their unconstitutional detentions and arrests.
> In 2016, FindFace generated controversy because it was once used to deanonymize Russian pornography actresses and alleged prostitutes. These efforts were organized by users of a Russian imageboard Dvach who claimed that women in the sex industry are “corrupt and deceptive”, according to Global Voices. In addition, FindFace has been characterized as a major step in the erosion of anonymity.
Seeing the context in which the photos are taken (busy streets, kind of grimy looking) makes you realize how powerful a camera is at creating an illusion. That is, taking a good photo means composing your shot so that you don't include elements you don't want in it. What you end up with is a photo which makes it appear you're the only one at a location, or that a location is more beautiful or exciting than it actually is. Your brain fills in all the details of what's outside of the frame, which often isn't what's really there.
Exactly. “The photographer’s eye”. Photography isn’t much less art than, say, painting. Not much different than writing a novel, reporting on a court case or sharing your version of events to your marriage counselor.
Some people put a lot more effort into their photos than I do! Although as they selected people with 100k+ followers I guess taking good photos is basically work for them (or rather their patient partners).
There's something about it that's so undignified, and cringe inducing to me. To see someone pose, repose, inspect, repose, etc. I just could never want to put that much effort into how I'm appearing. The faux spontaneity is just so off-putting.
I think it would be really good educational material. Showing the silly dance that is performed removes all the glamour from the result, and might help alleviate the bad psychological effects that glamour causes.
This is like saying revealing that pornography is a studio act will turn away its viewers. All but the most naive of followers already know this is how much time it takes to get pictures that look that good.
I was actually hoping to see real juicy stuff (maybe reflectors? Or them smoking something after?) but all seemed pretty tame to be honest.
> This is like saying revealing that pornography is a studio act will turn away its viewers.
No, it's like saying that revealing pornography is a studio act will cause them to see how little it resembles a normal, pleasurable sex act, and that they may decide afterwards not to hold it up as a standard of comparison for their own sex lives. You've added the part about "turning away viewers." People know movies take a lot of production and love them for it. They don't judge themselves for not being able to fly, or outrun an explosion.
Lots of people with popular social media presences do “behind the scenes” videos showing how much effort goes into their posts. They always get big spikes in followers afterwards.
Their followers know it’s an act, that’s what they’re there for.
Seeing how the sausage is made almost always results in disillusionment regardless of the industry.
Some of the magic was taken away when I learned that certain scenes in a movie I liked required dozens of takes: it felt too mechanical/industrial and went against my romantic notions of how art is made. This is not far off from the pose-repose-inspect cycle you noted, but with a much larger supporting crew (and budget) and far longer hours.
Most art is the result of much trial and error. Most finished works of creativity/art, be it visual, music, poetry or prose, are preceded by numerous sketches, drafts, discarded attempts and edits. Creativity is a process and requires work. I think this is important to know, as so many people think they aren't artistic or creative because they can't immediately create something wonderful. But those who seem to create art spontaneously have practiced a lot to develop that skill. It is possible to practice creativity in order to develop more of it.
How you stitch each of those together into a seamless ever-changing story *is* the magic, especially when you know how it's done.
Everything happens in tiny, almost imperceptible 1/24th-of-a-second steps, and yet an eye-blink instant can make the difference between you believing a scene and not believing it.
Next time you watch a film or a TV programme, try to see how the trick is done. Watch for how people are positioned so that their eyelines cross when they're talking. Watch how when you cut from one shot to another, there's a movement in the shot that you leave, that matches the shot that you enter. This fools you into not "seeing the join". As you turn your head to look around you blink subconsciously, so your brain is used to dealing with cutting between scenes. How you do that to tell a story is a phenomenally powerful and beautiful art.
> How you stitch each of those together into a seamless ever-changing story is the magic...
Unfortunately, it is a lesser magic than the one I naively believed in: that of genius directors who have fully formed visions they bring to the screen, and world-class actors who need 2 or 3 takes tops. It was all very idealized, and in my defense, I was very young.
Now I enjoy deconstructing tools of trade on some movies on a second watch: paying attention to camera angles, composition, color grading and sound. The only thing I regret is learning about the Wilhelm scream, because it always destroys my suspension of disbelief, and I hear it more often than I'd like.
Traffic cameras are publicly available in London. The first thing I used to do when coming to the office each morning was to look myself up in the traffic cameras along my journey (historic footage is also publicly available).
What the actual fuck. This is actually frightening. I can watch anyone, anywhere, anytime in London, anonymously, without them knowing? I guess I have to add yet another city to my never-visit-again list. No wonder schizophrenia is more common in developed countries, you ARE being watched.
Remember that this is still 2022. People think there's no privacy anymore, but in a few decades we are going to look back and wish we could go back. Of course these are public spaces where privacy has never _truly_ existed, but having cameras transmitting your every step to anyone with internet access is chilling.
Many state department of transportation groups in the US have cameras on roadways all along their respective states online. They're usually fairly low-res, probably to reduce costs and data since they only need to check traffic flow, but they're often public. Check your state's DOT website and look for traffic; there could be a treasure trove of cameras.
This reminds me of an old 4chan "prank". There was an open camera somewhere in New York(?) and in the view was a shop with various display stands. Users would post something like "Im gonna do it, watch!", everyone would start watching the video feed, and half an hour or so later a kid would run up and knock over one of the displays. They'd just fuck with the poor shop owner whenever they were bored.
Considering the open camera "feeds" have the bounding boxes around people, I guess they're CCTVs mistakenly put on the open Internet, findable through some services (can't remember the name of a famous one).
The Temple Bar (Dublin) actually makes this publicly available. They also have a feed of the inside of the bar so that you can virtually experience Ireland.
I was scared at first too until I realized we're talking about people already taking photos for the purpose of posting to Instagram anyway. It's involuntary surveillance while participating in voluntary surveillance.
It's only following people who posted their own photos voluntarily in this particular implementation, but consider the next incremental step for this same approach: a website where you upload a photo of anyone's face and get a map/trail of every public photo/video posted by someone else where that face appears in the background for the last month. Now you're not just finding the subject of a voluntary portrait in security camera footage from a public place, but finding anywhere that any private person has gone in any populated area where anyone else is taking photos for social media.
How many involuntary social media photo and video backgrounds do you think someone living in NYC or SF is identifiable in each day? I would venture to guess: enough to track a lot of their life. The only thing I see stopping anyone from making this site today is the challenge of scraping large quantities of public data from social media sites. Once you have the data, the rest seems like a solved problem.
The real problem is things like gait recognition. Most people already cover some or all of their hips, legs, feet, and so on. But it's hard to hide the way that you walk, regardless of what clothing you wear. And even if you can fake it for a short period of time, it's hard to be consistent about how you fake it over a long period of time.
When you can accurately identify the person who is completely covered head to toe, then it becomes a lot harder to hide.
You might enjoy "the green leopard plague" [1], it's a sci fi novella and (mild spoilers ahead, pause reading if you want to avoid them) it describes a future where image processing like this is a bit more common and accessible.
Well, some of the things shown in the film were already reality at the time. Except that you were called a conspiracy theorist if you pointed it out back then.
What makes an "influencer" less deserving of privacy in the dimensions they haven't shared? Furthermore, what makes someone an influencer, when this technique can be used on smaller accounts too?
Not everyone on insta publishes where they took the photo or wants it to be known. But if someone with an AI watches every camera in the world, they could easily find lots of people.
This is a mixed bag of both good and bad. Imagine a domestic violence victim being tracked/stalked by their abusive partner. Conversely, allowing citizens to monitor these cameras could potentially help prevent/solve crimes. In the end, it's probably a Constitutional dilemma that will find its way into the high courts someday soon.
To detect faux-lean people, notice that the shot angle has evolved toward being a low angle close to the ground.
Also, wearing black should immediately be a flag for further scrutiny.
It is the first suggestion for slimming attire, and the optical illusion serves both their own confidence and making other people want to engage with them.
As if there is some awareness that everyone stopped saying anything about it but still aren’t personally interested
Many (most?) photos have EXIF tags with timestamps showing when the picture was actually taken, regardless of when it was posted. But given the amount of post-processing that likely happens to Instagram photos, I would not be surprised if that info was stripped out entirely, or at least altered.
After 4chan made checking EXIF data a common doxxing tactic (complete with "how to" guides for less tech-savvy individuals) and doxxing and stalking rose massively, most image hosts and social media sites started stripping EXIF data from uploaded images. Some sites will detect/extract certain fields only (e.g. the camera/lens used, for photography-focused image-sharing sites) and scrub the rest. It's been that way for roughly a decade, if not longer.
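The stripping itself is mechanical: a JPEG file is a sequence of marker segments, and EXIF lives in APP1 (marker 0xE1) segments near the start, so a host can drop those and keep the rest. A minimal stdlib-only sketch of the idea (not production code; real strippers also handle XMP, thumbnails, markers without length fields, and malformed files):

```python
import struct

def strip_exif(jpeg: bytes) -> bytes:
    """Drop APP1 (EXIF) segments from a JPEG byte stream.

    A JPEG segment is a 0xFF byte, a marker id, then (for most markers)
    a 2-byte big-endian length that includes itself.
    """
    assert jpeg[:2] == b"\xff\xd8", "not a JPEG (missing SOI marker)"
    out = bytearray(b"\xff\xd8")
    i = 2
    while i < len(jpeg):
        if jpeg[i] != 0xFF:
            out += jpeg[i:]          # entropy-coded data: copy verbatim
            break
        marker = jpeg[i + 1]
        if marker == 0xD9:           # EOI has no length field
            out += jpeg[i:i + 2]
            i += 2
            continue
        (length,) = struct.unpack(">H", jpeg[i + 2:i + 4])
        if marker != 0xE1:           # keep everything except APP1
            out += jpeg[i:i + 2 + length]
        i += 2 + length
    return bytes(out)

# Tiny synthetic "JPEG": SOI, an APP1 (EXIF) segment, an APP0 segment, EOI.
demo = (b"\xff\xd8"
        + b"\xff\xe1" + struct.pack(">H", 8) + b"Exif\x00\x00"
        + b"\xff\xe0" + struct.pack(">H", 4) + b"JF"
        + b"\xff\xd9")
clean = strip_exif(demo)
assert b"Exif" in demo and b"Exif" not in clean
```

The selective sites mentioned above would parse the APP1 payload and re-emit only whitelisted tags instead of dropping the whole segment.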
We have no idea as to the reliability of the system. We don't know how many false positives were found. We don't know if these were just hand selected from date/time/location.
Can someone explain to a novice how “AI” is used here? Is the person in the IG image being correlated to a security camera time stamp with face recognition or similar?
It is using computer vision to compare the instagram photo with the recorded 'open camera' footage to give the best guess at a timestamp. You can see the CV drawing boxes around the objects it identifies in the footage on the right side, and it already knows the photo was taken at that location (since the location tag was used to scrape the photos), so it is basically just hunting for a person who best matches the subject of the instagram photo.
The instagram photos are most likely not posted exactly when they were shot but some time after. Also, even if they post a rough location, there is no guarantee that they would show up on any particular cam. I'd imagine the AI has to go through a very large amount of pictures and cam footage to find one where the influencer shows up on a cam some time before the photo was posted.
Instagram surely strips the camera metadata from the image served to the viewers, including the time taken. Imagine the outcry if it "leaked" the geolocation of a teenager's bedroom.
How is this any more creepy than any of the other people in those public locations videoing the same thing? Every day, you are recorded by hundreds of cameras, and a lot of those cameras are privately owned/operated. Specifically, I'm talking about people's smart phones. No matter where you go, just take a look around you, and you will probably be in someone else's camera frame.
There is no expectation of privacy in public. This just drives it home.
Both are really creepy. It becomes more creepy when you go from a recorded video that no one really watches to one systematically analyzed with AI though.
if only "no one really watches" were strictly true. if that video gets posted to a social network but is rarely watched (by humans) afterwards, it is still now available for all of those evilScraper types to pull down and do whatever they feel with. those are the creepy bastards.
knowing that the three-letter agencies can do this same forensics to catch $badGuy is somewhat comforting, in that at least it can do the function it was meant for. preventing everyone else from doing the same is where the wheels fall off the bus. it's another one of those situations where the intended purpose gets shoved to the side and unconsidered purposes become the norm, which makes $tech look evil.
It's creepier because it's connecting it to real life people's accounts, meaning anyone can then trace that person in the video and find out so much more about their life or stalk them etc.
It depends on the bitrate. Let's assume 10 Mbps for each of five cameras. That'd be 16.4 TB for a month's footage.
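For reference, the arithmetic behind that figure (a sketch assuming 10 Mbit/s per camera and a flat 30-day month, ignoring container overhead):

```python
# Back-of-envelope storage estimate for continuous recording.
# Assumptions: 5 cameras, 10 Mbit/s each, 30-day month, no overhead.
cameras = 5
mbit_per_s = 10
seconds_per_month = 30 * 24 * 3600           # 2,592,000 s

total_bits = cameras * mbit_per_s * 1_000_000 * seconds_per_month
total_tb = total_bits / 8 / 1e12             # bits -> bytes -> terabytes
print(round(total_tb, 1))                    # → 16.2
```

Using the average month length (about 30.44 days) instead of a flat 30 days nudges this to roughly the 16.4 TB figure above.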
Depending on the application, you could lower the bitrate significantly during times of low motion or low detail (such as at night). You could also drop the framerate.
I've found that when the scene is static, without panning, and no wind blowing leaves on trees, you can compress really well. I have a Wyze outdoor cam looking at the front yard that does 1080p at a little less than 1 mbps.
These are probably privately owned cameras, which are far more widespread; i don't think governments have what meta has on you.
I'd go so far as to say that hand-wringing over government surveillance has served as a smokescreen for the proliferation of private surveillance.
Not that i want to advocate for government surveillance; just that, in my experience, people who worry so much about it seem to have entirely missed how much private companies have expanded into public space.
Private surveillance IS government surveillance at the end of the day. The government doesn't need to build an all-seeing eye when it can just buy the data.
This is horrifying. Insta should require government ID to view so if you stalk / kill someone the police can quickly narrow down the set of possible perpetrators and find you.
Do you use a credit/debit card? Do you pay utility bills, rent or a mortgage? Do you carry a cell phone? Do you drive a car? If you said yes to any of these (and many more!) then you are not invisible at all.
https://radiolab.org/episodes/update-eye-sky
The company in question:
https://www.pss-1.com/