Another critical shift in the last ~120 years is the ease with which private information can be productized.
I say 120 years because I believe the Kodak Brownie No. 2 was the inflection point. Other film cameras existed before that point, but did not have user-replaceable film - you would return the whole camera to Kodak and they'd send you back prints and a new camera.
Thus, before the Brownie No. 2, the camera's owner had never possessed the negative itself. For the first time, an indisputably real account of an event could be reproduced over and over, on demand, with a low financial barrier to entry, and in any shape or size the negative's owner could imagine.
Obviously, the internet age has come with its own staggering paradigm shifts. Sure, private information may not be more sensitive than it ever was. But, it is easier than ever before to distribute it for profit.
My issue, more specifically, is this: I think the actual privacy invasion doesn't come from the use of cameras, it comes from the use of databases.
I honestly don't care even a little if I happen to appear in (or even am the subject of) a random photo taken by a regular person. I care a lot if I appear in a commercially-operated snapshot/video.
The difference is that a regular person probably isn't going to include that photo/video and related metadata in a database where it will be combined with the contents of other databases and weaponized against me.
Yes, I know they may upload it to Facebook and such, where it is weaponized. That does bother me, but I choose to ignore that part except to make sure that my friends and family don't do it.
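To make "combined with the contents of other databases" concrete, here's a minimal sketch; every table name, path, and value below is invented, but the join is the whole trick: photo metadata plus one unrelated dataset is enough to put a name to a time and place.

```python
# Hypothetical illustration: photo EXIF-style metadata joined against an
# unrelated dataset (e.g. something bought from a data broker).
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    -- metadata scraped from uploaded photos (timestamp + GPS)
    CREATE TABLE photo_metadata (photo_id TEXT, taken_at TEXT, lat REAL, lon REAL);
    -- a second, unrelated dataset
    CREATE TABLE checkin_records (person TEXT, checked_in_at TEXT, lat REAL, lon REAL);
""")
db.execute("INSERT INTO photo_metadata VALUES ('IMG_1042', '2024-06-01 14:03', 51.5074, -0.1278)")
db.execute("INSERT INTO checkin_records VALUES ('J. Doe', '2024-06-01 14:05', 51.5075, -0.1277)")

# Join on rough time/space proximity: the anonymous photo now names someone.
rows = db.execute("""
    SELECT p.photo_id, c.person
    FROM photo_metadata p
    JOIN checkin_records c
      ON ABS(p.lat - c.lat) < 0.001
     AND ABS(p.lon - c.lon) < 0.001
     AND ABS(julianday(p.taken_at) - julianday(c.checked_in_at)) < 1.0 / 24
""").fetchall()
print(rows)  # [('IMG_1042', 'J. Doe')]
```

A regular person's camera roll never gets this treatment; a commercial database gets it routinely.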
> combined with the contents of other databases and weaponized against me.
What does weaponizing mean here? Because I do believe it is a hypothetical threat that most don't actually see as a real threat. The only scenario I can see is that it acts as evidence that you were at a specific location/time, when it is inconvenient for you to admit to being there.
"Because i do believe it is a hypothetical threat that most don't actually see as a real threat."
In another of today's HN stories, 'For advertising: Firefox now collects user data by default', I made almost the same point. If a majority of the citizenry is indifferent and/or sees no threat, then the politics will not change and no new laws or regulations will be introduced.
Moreover, with no concerted opposition or action the status quo will only be reinforced; thus, by default, those with vested interests gain additional political power to further ensure there's no change.
What I mean by weaponizing is that it's used to do things like target me for advertising, adjust my credit rating, set my insurance rates, and even affect whether I can get a job with certain companies.
Imagine you're a closeted homosexual caught in a photo kissing another member of the same sex, and you live in a locality where that puts your life in danger. Imagine you live in a theocracy and someone catches you without your head scarf.
THAT is weaponization.
Your examples aren't wrong but the stakes of losing control of our data are so much higher than you're imagining.
I saw a post on Reddit about how someone in Kenya took a still image of two men kissing from an elevator camera video and posted it on social media, where it went viral.
Seems dangerous
"I have nothing to hide" makes sense to me - except sometimes you don't know what you have to hide, or it can change after the fact
Absolutely, but I've learned from experience that the more extreme examples tend to get dismissed pretty quickly for various reasons.
The examples I used are ones that affect pretty much everybody and, more importantly, are directly relevant to me. Since I was commenting about my own personal situation, I thought it prudent to limit my response to that subset.
But you're right -- there are many layers to this onion, and some are far worse than others.
It amazes me that people think of adverts as benign. Advertising is one of the largest industries on the planet, and its entire purpose is to brainwash you into buying something you wouldn't otherwise buy.
There's a reason they spend so much on things like facial recognition: they will take more money from you than they spend.
> I care a lot if I appear in a commercially-operated snapshot/video.
This is where likeness rights come into play. A commercial entity can't use your image as an endorsement of their product or in marketing materials without your consent, which is why many production companies err on the side of making sure they vet every face in every image of their marketing material.
> The difference is that a regular person probably isn't going to include that photo/video and related metadata in a database where it will be combined with the contents of other databases and weaponized against me.
As you said in the next paragraph, this is impossible to prevent without stripping people of their copyrights to use their work however they please, including by sharing on social media or uploading to an extremely value-oriented service (like Google Photos) that only provides its services for free/cheap because of the value it gets from being able to "improve the product" using user content.
I don't see how likeness rights apply. I'm not talking about my image being used in marketing materials.
> this is impossible to prevent without stripping people of their copyrights to use their work
That's not a copyright issue. Copyright allows you to restrict how others use your works; it says nothing about how you use your works.
I think you're talking about free speech rights here, and free speech rights have never been, will never be, and shouldn't be, absolute. There are many cases where those rights (like all rights) need to be balanced against other contradictory rights.
And some countries at least pretend that privacy rights are stronger than the right to publish photos that are not marketing/advertising, although I'm sure those rights are violated many thousands of times a day. The US does not, except in very specific circumstances--including said marketing/advertising, circumstances where you had a reasonable expectation of privacy, or misrepresentation.
That depends on what your concern is. A half-naked photo (which wouldn't bother me, but for the sake of argument...) might be more damaging short term, but the other, when combined with all the other data collection, is more damaging long term.
If you think of the processed film strip as a rudimentary database, a chronologically organized record of images, you can see how it aided in productization. On the other hand, it's also clear that computers are eons ahead in that capacity.
Other than targeted advertising, how has private (I think you meant personally identifiable) information been productized?
The only examples I can think of are auto insurers buying your vehicle usage data, and creditors buying your borrowing data. I don't know if cameras or the Internet have anything to do with either case.
Political parties buying your browsing/buying habits to hit you up with political ads. Police buying your location data to see whether you were near a crime. Insurance companies will soon be buying data about your exercise routines so they can adjust your rates accordingly.
And then there are the really creepy services. There was a service not long ago in Russia that would allow you to upload a picture of any woman you saw on the street. They would identify the face (via VK) and provide you with their contact and job information. Totally creepy. Totally abusive. But easily done with basic tech.
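For a sense of how little "basic tech" that actually takes, here's a rough sketch using the open-source face_recognition library. The file names and the profiles mapping are placeholders; this illustrates the general technique, not that particular service.

```python
# Sketch only: match a street photo against scraped profile pictures.
# Paths and the "profiles" mapping are hypothetical placeholders.
import face_recognition

profiles = {
    "profile_001.jpg": "Person A",
    "profile_002.jpg": "Person B",
}

known_encodings, known_names = [], []
for path, name in profiles.items():
    image = face_recognition.load_image_file(path)
    encodings = face_recognition.face_encodings(image)
    if encodings:
        known_encodings.append(encodings[0])
        known_names.append(name)

# A candid photo taken on the street (placeholder path).
street = face_recognition.load_image_file("street_photo.jpg")
for face in face_recognition.face_encodings(street):
    distances = face_recognition.face_distance(known_encodings, face)
    best = distances.argmin()
    if distances[best] < 0.6:  # the library's usual default tolerance
        print("Possible match:", known_names[best])
```

Scale that over millions of scraped profile photos and you essentially have the service described above.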
> There was a service not long ago in Russia that would allow you to upload a picture of any woman you saw on the street. They would identify the face (via VK) and provide you with their contact and job information
Seems like gender technically wouldn't be a blocker; was that a product choice, or a marketing choice?
The wiki doesn't say anything gender-specific, so I'm guessing that this is just the casual inevitability of any new tech rather than an official spin. Some will use it to enable harassment, and women are disproportionately the targets of harassment.
> Insurance companies will soon be buying data about your exercise routines so they can adjust your rates accordingly.
If this results in lower insurance for those who are deemed to be healthier due to exercise, then it's a good outcome. Pricing insurance more accurately is almost always a good outcome.
> if this results in lower insurance for those who are deemed to be healthier due to exercises, then it's a good outcome.
In reality, it'll result in higher insurance for those who are deemed unhealthy, or simply at risk. As someone said already, maybe joggers suddenly get lower premiums and bikers (especially in biker-unfriendly cities) get higher ones. Maybe you end up paying even more if you bench more at the gym than the "recommended amount". Maybe they collaborate with certain fitness companies and will charge you less if you click an affiliate link to buy a health product.
The possibilities here are endless, and this is just for one aspect of life. This can in theory work with a benevolent leadership, but that ship sailed decades ago.
Insurance is about managing average risk for a population. The ones who end up not needing it necessarily paid more than they would have to, _in hindsight_.
Varying the premiums across the insured population adds some fairness to the equation, but the moment it starts actually culling members of said population through outrageous pricing, it is just plain cheating.
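A back-of-the-envelope sketch of that tension (all numbers invented): pooling spreads the expected loss evenly, while full segmentation pushes each group toward its own expected loss, which is exactly where the outrageous pricing for the riskier group comes from.

```python
# Invented numbers: pooled vs. risk-segmented premiums.
# Premium here is just expected annual loss plus a flat loading.
loading = 100  # per-policy overhead/profit margin

population = [
    # (group, members, probability of a claim, average claim size)
    ("low-risk",  800, 0.02, 10_000),
    ("high-risk", 200, 0.10, 10_000),
]

total_members = sum(n for _, n, _, _ in population)
expected_total_loss = sum(n * p * size for _, n, p, size in population)

pooled = expected_total_loss / total_members + loading
print(f"pooled premium for everyone: {pooled:.0f}")      # 460

for group, _, p, size in population:
    segmented = p * size + loading
    print(f"{group} segmented premium: {segmented:.0f}")  # 300 vs 1100
```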
All insurance specialises in certain groups. Regular holiday insurance is cheaper than snowboarding holiday insurance (and regular holiday insurance doesn't cover snowboarding). Why would this be different?
Because if we nickel-and-dime people for every decision, we risk becoming a society averse to risk. The safest holiday is the holiday not taken. The safest visit to a national park is the visit done online. If we want people to actually do things, participate in the economy and culture at large, we should not attach a usage tax to every little activity beyond sitting at home all day.
Because if you don't get snowboarding holiday insurance then you shouldn't be snowboarding but you can do anything else while you're on holiday.
Meanwhile in the US if you get kicked off your medical insurance with any kind of chronic illness then you either pay out the ass for basic treatment or die.
But you don't seem to care if house and contents insurance doesn't insure people whose house is currently burning.
The problem with medical insurance is that it's a flawed system in the US. Everybody gets sick; it's a near-100% guarantee, especially as you age.
Therefore, medical "insurance" should not be insurance, but should be a fund. It should be paid into by all taxpayers at some rate proportional to their income (like VAT), and then the fund goes to fund _all_ non-elective medical procedures.
Insurance for medical should be for _premium_ facilities, like private rooms, private nurses etc. Not for the treatment.
> But you don't seem to care if house and contents insurance doesn't insure people whose house is currently burning.
That is the case. If your house is currently on fire, no one will insure you. What are you doing about this awful situation?
> Therefore, medical "insurance" should not be insurance, but should be a fund. It should be paid into by all taxpayers at some rate proportional to their income (like VAT), and then the fund goes to fund _all_ non-elective medical procedures.
You seem to be extremely US-centric. You can't differentiate between "how the US does insurance" and "fundamental truths about insurance". This discussion is not just a proxy for an incredibly well-worn, and separate, conversation on US healthcare.
That's the same as saying someone who's poor and low risk can't snowboard, because their premiums have gone up due to a richer high-risk snowboarder you're forced to sell subsidized insurance to.
No, it's saying someone who's poor should be allowed to snowboard just as much as someone rich, and currently they can't and that's a problem. You are advocating for poor people to not be allowed to snowboard, and if they do snowboard, for them to no longer be allowed to drive.
I suspect there are many activities that correlate to higher risk, and many people would not like their insurance to come with a page of fine print of exclusions/fee adders associated with those activities, even if it theoretically is more equitable.
As you say, while there's probably some correlation between cycling and healthy activity in general, it's probably also an activity--depending on local infrastructure and where you cycle--that by itself adds risk of injury.
Several car insurance companies in the United States (and perhaps elsewhere) have started pushing little devices that plug into your car's OBD2 port and narc on you for speeding, hard acceleration, etc. This is different from them buying vehicle usage data from third parties, because in this case they're "buying" it directly from the source, essentially for free.
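The telemetry itself is mundane. As a rough sketch of what such a dongle does (this uses the open-source python-OBD library; the threshold and polling interval are arbitrary choices, not anything a particular insurer has published), flagging "hard acceleration" takes little more than polling the standard speed reading:

```python
# Sketch: poll vehicle speed over OBD2 and flag hard-acceleration events.
# Threshold and interval are arbitrary; a real dongle would also log and upload.
import time
import obd

connection = obd.OBD()  # auto-connects to a USB/Bluetooth ELM327 adapter

HARD_ACCEL_KMH_PER_S = 8  # arbitrary "hard acceleration" threshold
prev_speed = prev_time = None

while True:
    response = connection.query(obd.commands.SPEED)
    if not response.is_null():
        speed = response.value.to("km/h").magnitude
        now = time.time()
        if prev_speed is not None:
            accel = (speed - prev_speed) / (now - prev_time)
            if accel > HARD_ACCEL_KMH_PER_S:
                print(f"hard acceleration: {accel:.1f} km/h per second")
        prev_speed, prev_time = speed, now
    time.sleep(1)
```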
The entire junk debt collection industry.
DeleteMe is a product that was made in response to the productization of PII, so that's an interesting footnote in this conversation as well.
Here is a good one: when one applies for a security clearance, for a job in a large sector of the employment market or a government job, all this data could be used.
Doctoring a photographic print is one thing. Doctoring a negative in convincing fashion is another. The only approach I can think of is a double exposure, but good luck pulling that off without it being obvious.
Photographic film is remarkable in that way - you can screw with the print, but the thing that's actually captured on the film isn't easy to manipulate once it's been developed.
Doctoring negatives used to be a very common way of altering photographs. Think about it: you can iterate on the process and have a reproducible product at the end.
I can only imagine the trickery necessary to make this work but the historical record is clear that negatives were definitely altered.