Because the general public doesn't care enough to raise a stink? Back when the Snowden story broke, I tried explaining its significance to a handful of my friends - all I got was a yawn. These are not dumb people; they are smart professionals (non-IT). Even then, I failed to get the point across to them.
It would require some serious education before the public wakes up to the dangers of private companies running amok with their data. The sad thing is, it is already too late. It is going to be very difficult to put a lid on this. This is a company that we (now) know about - how many are silently working in the shadows that we don't know about?
> It would require some serious education before the public wakes up to the dangers of private companies running amok with their data.
Exploitation is the best education there is. Everything is fine until it isn't. One day, companies and governments are going to start doing things with personal information that are unacceptable to even the average person. By then, it will probably be too late.
Case in point: The Nazis used census records to identify Jews. Who knows what pieces of personal information the next madman will consider relevant. They'll certainly have much more of it to choose from.
Is potential surveillance by government entities the primary reason you are against companies generally doing whatever they want with user data? Are there other rationales?
Just like your friends, I personally don't particularly care either, but from time to time I have tried to understand the privacy crowd's obsession with this issue and the rationale behind laws like the GDPR and CCPA, as well as the desire for even more restrictive laws, and I truly don't get it.
Is there a manifesto somewhere I'm missing? Some essay or thinkpiece that lays out in detail the case against collection of user data?
Take any century in the last 2000 years of human history, and there are files about people. For a long time they were carved or handwritten. Then they were printed. Now they're digital. But it's the same thing; only the scale and speed change.
And at any point during those centuries, somewhere in the world, some entity (it doesn't have to be the gov) does bad things with those files. It's different every time. Excluding, killing, tracking, stealing, controlling... The form changes every time, but it's the same thing: abuse of those files.
It would, of course, be less of a problem if access to the information were perfectly symmetrical. If anybody could access anybody's data, society would probably have a hard time for a few years, then adjust. And maybe become fairer.
But that's not what's happening. Here it just reinforces power asymmetry. And it creates heavily biased incentives that affect everybody's lives.
There are three reasons why people give your answer.
1 - We had a nice run for a few decades in North America and Western Europe. It's been a sweet life. And the human mind sets it as a new baseline. Now people see this as normal, and anything else as the exception.
However, those decades ARE the exception. An exception that needs maintenance to be preserved as best we can.
2 - We are already pretty bad at making the connection between our misery today and our past lifestyle choices, and today's information system makes it extra hard.
There are several factors behind this: those in power getting really good at PR, information overload, more levels of indirection between causes and consequences, and the ever-increasing complexity of the whole system.
3 - The convenience is huge, and the price is delayed.
We don't get tracked for free; we get huge convenience in exchange. Plus we don't pay the price immediately, nor individually. We pay it as a society, and since it's cumulative, it's not obvious how much it costs us. It will only become painful in... well, nobody knows when.
In fact, not only would doing things right strip us of that convenience, we would each pay a steep price on top of it, right now, while watching everybody around us not bother.
It's the exact same problem as global warming.
Not accepting tracking is a deep and important political decision that shapes the future of our entire society. It is as important as keeping church and state separate or defending freedom of speech.
And that's also why it's not a popular view: it requires thinking about what society we want to build, and not just what life we hope to have individually.
Yes. The fundamental issue is the total lack of respect for the user's consent.
People usually have no problem with volunteering personal information that is relevant to whatever activity they're trying to accomplish. For example, a company will need people's addresses in order to ship products to them. This is a voluntary, explicit and respectful process: consumers willingly and knowingly give the company copies of the information they need to perform the service and the company uses that information only for its intended purpose and absolutely nothing else.
The problem we face today is that businesses are collecting massive amounts of personal information indiscriminately, invisibly and without true consent. Web browsers hemorrhage personal information without users even being asked, and there's no way to stop it. Companies make apps that mine people's phones for every last bit of data they can get their hands on. Web sites put up annoying little banners saying they collect data and call it informed consent even though there's no way to say no. They bury clauses in terms of service nobody reads and say the user agreed by continuing to use the site, even though cookies were set and fingerprinting was performed on the very first visit, before the user could possibly have known about the contract, much less read it.
Not only that, the information is being abused to do things people don't actually want. When people give a company their email addresses, they assume they will receive messages that are actually important. What happens is the company thinks it has every right to spam people with marketing and advertising emails or sell their data to other very interested parties. People give a company their phone numbers and next thing they know they're getting marketing calls they never asked for and can't opt out of.
And then there are the security issues. If someone has information about people in a database, there's always a chance it can be leaked, even if every precaution is taken. The potential for harm is significant. Data should be considered a liability for companies. Knowing things about people should cost them money. They should have ample incentive to collect as little data as possible, to limit as much as possible the scope and frequency of use of whatever data they collect, and to delete that information as soon as it is no longer needed. What's happening today is the complete opposite: companies are collecting as much data as possible, keeping it forever and using it for whatever makes them the most money, regardless of people's wishes.
These examples are relatively benign, but the potential for harm is always there. What if companies start buying up personal information and using the data to profile and exclude job candidates? No doubt information such as browsing history would condemn a huge number of people. What if companies find a way to deanonymize that data and link it to candidates?
> Back when Snowden story broke, I tried explaining its significance to a handful of my friends - all I got was a yawn. These are not dumb people, they are smart professionals (non-IT). Even then, I failed to get the point across to them.
If you were unable to convince multiple smart people whom you care about and respect, then perhaps it is time to at least consider the possibility that your position is incorrect?
Investigations are on the way. Heise online writes (translated): "Hamburg data protection officer takes action against Clearview. Following a complaint, data protection officer Johannes Caspar is investigating the US company Clearview AI, which specialises in automated facial recognition. [...]"
Legal action is currently underway, but it will take some time to clarify things.
For example, some of the pictures come from old, failed social networks which sold them and whose terms and conditions (AGB) _might_ state that they can do so. It's questionable whether such terms are valid at all, but things that need to be clarified include:
- whether European citizens are affected (it's hard for that not to be the case)
- the exact legal status, as pictures might have been obtained legally before the GDPR
- ...
- also note that Clearview stores biometric data derived from the images (otherwise fast search/lookup would not be implementable; a rough sketch of why is below)
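To make that last point concrete: a face search engine does not compare raw photos at query time; it compares fixed-length embedding vectors computed from them once, which is what makes lookups over a huge index fast. A minimal sketch of the idea, with a made-up `embed_face()` standing in for a real neural-network model and random vectors standing in for a real index:

```python
import numpy as np

# Stand-in for a real face-embedding model: maps image bytes to a fixed-length
# unit vector. In a real system this is a neural network; here it is just a
# deterministic pseudo-random vector so the example runs on its own.
def embed_face(image_bytes: bytes, dim: int = 128) -> np.ndarray:
    rng = np.random.default_rng(abs(hash(image_bytes)) % (2**32))
    v = rng.standard_normal(dim)
    return v / np.linalg.norm(v)   # unit length, so dot product == cosine similarity

# The "index" is the derived biometric data: one stored vector per known face.
# This, not the original photos, is what gets searched.
labels = [f"person-{i}" for i in range(1_000)]
index = np.stack([embed_face(name.encode()) for name in labels])

def lookup(query_image: bytes, top_k: int = 3) -> list[tuple[str, float]]:
    """Return the top_k most similar stored identities for a query image."""
    q = embed_face(query_image)
    scores = index @ q                        # one matrix-vector product over the whole index
    best = np.argsort(scores)[::-1][:top_k]
    return [(labels[i], float(scores[i])) for i in best]

print(lookup(b"person-42"))   # the stored vector for person-42 matches itself with score 1.0
```

The point is that once those vectors exist, a query is a single similarity search; deleting the photos alone would not remove the biometric data.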
So I would not be surprised if Clearview is required to delete all data it cannot be sure isn't from EU citizens, which I think would mean all data, given what they do and don't store. Obviously they won't comply, and an EU-wide arrest warrant might follow, which is kind of useless if the person never enters the EU. I highly doubt they will try an international warrant.
So practically it's unlikely that anything will change, except the operators of Clearview being officially listed as "potential" criminals (no arrest => no court => innocent until convicted).
GDPR is EU law and does not cover American corporations. It relies entirely on foreign cooperation [1] to extend that reach internationally, which so far has been untested and is not likely to get any real support in the near term given the current economic situation.
There is no arrest warrant for Google, Facebook or ad network employees either, despite them violating the GDPR daily and at a very large scale.
Granted, the GDPR doesn't say anything about arresting offenders, but the companies should at least be investigated and fined, which isn't happening either.
If you search for Google, you will find one €50M fine. What is 4% of Google's revenue again? €50M gets filed under cost of doing business, or under the corruption budget if you are feeling less generous.
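For a rough sense of scale (the 4% ceiling is the GDPR's maximum-fine rule; the revenue figure below is an assumption for illustration, in the neighborhood of Alphabet's publicly reported annual revenue at the time):

```python
# Compare the GDPR's maximum fine (4% of annual global turnover) with the one
# €50M fine mentioned above. The revenue number is an assumption for illustration.
annual_revenue_usd = 160e9                       # assumed: ~$160B global turnover
gdpr_ceiling_usd = 0.04 * annual_revenue_usd
actual_fine_eur = 50e6

print(f"GDPR ceiling: ${gdpr_ceiling_usd / 1e9:.1f}B")   # about $6.4B
print(f"Actual fine:  €{actual_fine_eur / 1e6:.0f}M")    # €50M, more than 100x smaller
```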
As long as they process openly available data, I don't see a difference from searching your name on the internet in terms of the GDPR. They also responded to the request (maybe not fast enough). Deletion might be difficult to do in a way that is fully compliant with EU regulations. Whether it's ethical is another story...
> PII may only be processed if you have explicit consent for the exact purpose you want to use it for.
Not exactly. Consent is one of six allowed ways to process PII. It's just that for advertising/tracking use it's probably the only one you can use.
Can you back up your claim? It's true that the data is still personal, but a lower level of protection applies. Except for children, as I remember. There is a notification requirement, but with exceptions if it is not realistically feasible. Sorry, I'm on my phone with a child sleeping in my arm.
Here is an article I could quickly come up with in English
How is a Facebook profile openly available? Facebook probably gets PII via unreasonably broad blanket opt-outs, which is itself problematic, and then it is shared with, or not kept safe from, a third party.
Good thing I'm not relying on you for my legal questions. The GDPR gives users control over their own data, and each new use of that data requires that specific consent be asked for. So using PII of any person, no matter where you found it, requires explicit consent.
They are using PII of hundreds of millions of Europeans without written consent.
That alone should mean billions of euros in fines.