Personally, I think the GDPR cookie notifications have severely degraded the user experience of the web. I also think it's become just another "Terms and Conditions" - something that people reflexively accept because they want to get on with using a product.
Future regulations with teeth, which don't exist yet, would be things like:
- Data portability: I should be able to export my entire Facebook account in full in a standardized format and easily transfer it to any other social network of my choosing.
- Algorithmic control: I should be able to choose which recommendation algorithms are being used on me, or opt out of algorithms entirely.
- Algorithmic transparency: As a consumer, I should be able to see something similar to what Facebook's growth team sees. What specific changes caused people in my cohort to increase their watch time by 2%? Was it better button placement, which I'd be okay with, or was it an increase in conspiracy content, which might not be?
All of these would be huge overhauls and difficult for the social media companies to deliver, but if you are pushing content in front of billions of people's faces for hours a day, you should bear a correspondingly immense level of responsibility.
I reject all the 'legitimate interests' reflexively. I think it's great. Not! It just shows me how much tracking there actually is... And since I delete my cookies every session, I have to do it every time.
I still wonder what the 'legitimate interests' of ad companies are. I also wonder whether rejecting the cookies actually works; it's not like I can check, right?
> Personally, I think the GDPR cookie notifications have severely degraded the user experience of the web.
The vast majority of cookie banners are actually illegal under GDPR. The real downside of GDPR isn't the banners; it's the inaction of EU authorities, who should have cracked down on these practices long ago.
1. Cookies essential for the functionality of your website (such as session cookies) don't need consent, and are explicitly allowed (you need to have an easily accessible clear-text explanation of what they do)
2. Pre-selected boxes do not constitute consent
3. You must provide a simple "opt out of all and proceed" button
4. You are not allowed to degrade functionality if the user has opted out of non-essential cookies
5. You are not allowed to load any non-essential cookies before consent is given (a minimal sketch of what this looks like in practice follows this list)
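To make point 5 concrete, here is a minimal, hypothetical sketch in TypeScript. The button IDs (#accept-all, #reject-all), the loadNonEssential() helper, and the analytics URL are all made up for illustration; the only point is that nothing non-essential is loaded until the user has actually opted in, and that rejecting is a single click.

```typescript
// Hypothetical sketch: essential cookies (e.g. the session cookie) are set
// regardless; anything non-essential is only loaded after an explicit opt-in.

type ConsentChoice = "accepted" | "rejected";

function rememberChoice(choice: ConsentChoice): void {
  // Storing the choice itself is needed to honour the user's decision,
  // so it can arguably be treated as an essential cookie.
  document.cookie = `consent=${choice}; Max-Age=${60 * 60 * 24 * 180}; Path=/; SameSite=Lax`;
}

function loadNonEssential(): void {
  // Only now do we inject the tracking/analytics script (made-up URL).
  const script = document.createElement("script");
  script.src = "https://example.com/analytics.js";
  document.head.appendChild(script);
}

document.querySelector("#accept-all")?.addEventListener("click", () => {
  rememberChoice("accepted");
  loadNonEssential();
});

document.querySelector("#reject-all")?.addEventListener("click", () => {
  // "Reject all and proceed" must be as easy as accepting: one click,
  // no pre-ticked boxes, and the page keeps working normally afterwards.
  rememberChoice("rejected");
});

// On later visits, respect the stored choice before loading anything.
if (document.cookie.split("; ").includes("consent=accepted")) {
  loadNonEssential();
}
```

Note that the cookie storing the consent decision itself is generally treated as strictly necessary, since without it the site couldn't remember the user's choice.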
What is meant by 4)? Do users have the right to see a web site without ads? I'd think that if users don't consent, you should be allowed to block their access to your web site?
What is or isn't PII, which is a US legal term, is irrelevant.
What matters is whether it's Personal Data.
Personal Data is defined by the GDPR as:
"‘personal data’ means any information relating to an identified or identifiable natural person (‘data subject’); an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person".
I would say a tracking ID falls under "an identification number [or] online identifier"...
But as to the rest of your post: I don't think so. I think an identification number there means something like a government-issued ID number.
An online identifier would have to identify you (e.g. my Hacker News username is probably identifiable).
The way I think of it is: can someone who isn't authorised to know who I am look at that number in a system, then go off and correlate that info to find me, without further reference to other data in said system?
A database ID doesn't count, because you'd then need to look up something actually identifiable in the system to figure out who I am; neither does an opaque tracking number.
My social security number is identifiable; my email address may be identifiable; if I gave birth in region X to octuplets, then that probably is too.
If your basis for processing private data is consent, then under GDPR one of the conditions on consent is that it has to be freely given; it can't be traded for something.
In essence, under EU law privacy is an inalienable right: it's not something that can be contractually sold away (alienated) by the users. If you have a contract where users agree to let you do whatever you want with their data because you give them $100 or show them some content, that does not fit the GDPR's definition of consent, and the contract does not, and cannot, give you the right to process their data as you wish; that particular clause is effectively void, because the users are "selling" something they can't legally sell.
If some data is required to fulfil your contractual obligations to the user (for example, processing their address to deliver a pizza), then that is a legitimate use under GDPR 6.1.b, which does not require consent. But if you want to use the same data for some other purpose (for example, using that same address for ad targeting or giving it to a third party), the contractual-necessity basis of 6.1.b no longer applies; you're stuck with 6.1.a (consent), and that is valid only if it's a genuinely free choice, with no benefit or service conditional on providing "consent".
So you technically are allowed to block access to your site for people who don't click a checkbox saying "I agree to stuff". However, if you do, clicking that checkbox does not constitute freely given consent, so it can't give you any rights over the data of the people who checked it; for the purposes of GDPR, that checkbox is simply meaningless if access to your site was conditional on it. Users therefore have the right to (and will) file complaints about illegitimate use of their data right after clicking the "I agree to stuff" checkbox.
Additionally, as I only recently found out, it's common practice in European Law to have explanatory notes in the law itself. They are called recitals.
1) Cookie notifications are not required by GDPR. They are required by national-level implementations of the ePrivacy Directive.
2) Data portability is already a requirement under GDPR Article 20. The problem is that it requires competing services to work together to create interoperable systems or formats, and that's really unlikely to happen without additional regulatory action mandating that competitors cooperate (see the sketch below for what such a format might involve).
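As a hedged illustration of what an "interoperable format" might mean in practice (none of these field names come from Article 20 or any real standard; the schema is entirely hypothetical), the hard part is getting competitors to agree on something like this, not writing the export code:

```typescript
// Hypothetical, simplified schema for a portable social-media export.
// Nothing here is a real standard; it only shows that once a shared
// format exists, exporting and importing are trivial.

interface PortablePost {
  id: string;
  createdAt: string;        // ISO 8601 timestamp
  text: string;
  mediaUrls: string[];
}

interface PortableProfile {
  displayName: string;
  handle: string;
  contacts: string[];       // handles or opaque IDs of connections
  posts: PortablePost[];
}

// Exporting is trivial once the schema exists...
function exportProfile(profile: PortableProfile): string {
  return JSON.stringify(profile, null, 2);
}

// ...and so is importing on a competing service.
function importProfile(json: string): PortableProfile {
  return JSON.parse(json) as PortableProfile;
}

const me: PortableProfile = {
  displayName: "Example User",
  handle: "@example",
  contacts: ["@friend1", "@friend2"],
  posts: [{ id: "1", createdAt: "2020-01-01T00:00:00Z", text: "hello", mediaUrls: [] }],
};

console.log(importProfile(exportProfile(me)).posts.length); // 1
```

The code above is the easy part; the regulatory question is who forces competitors to agree on, and keep supporting, a common schema in the first place.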
People will always gloss over privacy policies, privacy tools, and privacy disclosures. The reality is that people are concerned about privacy, but not concerned enough to make any changes. People concerned about the environment still get on planes and fly all over the world.