In 2018 I built a privacy-friendly open-source consent manager (https://github.com/kiprotect/klaro) which is used on many websites across Europe. From the beginning I never liked the IAB framework, never implemented it, and told people that I regard it as unlawful, since e.g. a user cannot possibly make an informed decision that involves thousands of third parties. Still, many of our users kept asking me about it since it was "the way" to become compliant.
So I finally asked the IAB how one could potentially implement their framework in an open-source tool. Their answer was basically that it's not possible. You have to register as a CMP provider and ensure that your users are using your software in their compliant (ha ha) way, which is of course impossible to enforce with open-source software that everyone can self-host. In general, in my opinion the IAB framework is mostly a mechanism to shift liability from the advertisers who steal the users' data to the publishers and CMP providers. Therefore I'm quite happy we never got around to implementing this "feature" in our CMP, and I hope the IAB will quickly die and take all those alibi CMP providers down with it.
I like that your solution actually handles loading the third-party libraries and only does so after consent has been given, so that if a user opts out absolutely no data (not even a DNS lookup) is sent to the third-party.
A lot of consent management solutions appear to load the third-party scripts regardless and only focus on cookies, even though the real danger is IP-based tracking and browser fingerprinting, which don't depend on cookies or any persistent data being stored (trackers have adapted as modern browsers heavily restrict cookies).
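To make that concrete, here is a minimal sketch of the deferred-loading approach (the function names, the consent hook and the URL are purely illustrative, not Klaro's actual API): the script element for the tracker is only created once the user opts in, so a declining user's browser never contacts the third party at all.

    // Inject the third-party script only after the user has opted in.
    // Until then the browser makes no request to the tracker's domain
    // (and therefore no DNS lookup for it either).
    function loadAnalytics() {
      const script = document.createElement('script');
      script.src = 'https://tracker.example.com/analytics.js'; // placeholder URL
      script.async = true;
      document.head.appendChild(script);
    }

    // Hypothetical hook called by the consent dialog with the user's choices,
    // e.g. { analytics: true, ads: false }.
    function onConsentDecision(decisions) {
      if (decisions.analytics) {
        loadAnalytics();
      }
      // If the user declined, the script element is simply never created.
    }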
I am a user of klaro.js and just wanted to express my thanks for the work you do.
One question that arose, though, is how one would fulfill the requirements of GDPR regarding the logging of consent as a kind of 'paper trail'. As I understand the requirements, one would need to store some form of identification (like an arbitrary ID), the time and scope of consent, and also store this for the user or on their machine. I understand how this could work for email opt-ins. But consent on a web page?
Thanks! Documenting consent can happen directly in the user's browser; this is also GDPR- and ePrivacy-compliant. Those legislations don't require server-side storage of consent; it's another myth propagated by CMP providers to sell subscriptions.
Storing consent server-side only makes sense for identified users (e.g. those who are logged in on your site), as there you actually have something that you can link the consent to. For an anonymous user that e.g. has a Google Analytics ID stored in the browser, you'd have to store a link to that ID on the server side as well in order to link it to the consent, and that is not privacy-friendly. Storing IP addresses also isn't a good idea, as you're again creating more privacy risks for the user than necessary.
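As a rough illustration of what such a browser-side consent record could look like (the storage key and field names here are my own, not a prescribed format), something like this records when consent was given, against which version of the notice, and for which services, without introducing any new server-side identifier:

    // Keep the consent record in the user's own browser (localStorage),
    // so no additional identifier has to be stored on a server.
    function saveConsent(decisions) {
      const record = {
        timestamp: new Date().toISOString(),  // when consent was given
        noticeVersion: '2021-03-01',          // version of the consent notice shown (example value)
        decisions: decisions                  // e.g. { 'google-analytics': true, ads: false }
      };
      localStorage.setItem('consent-record', JSON.stringify(record));
    }

    function loadConsent() {
      const raw = localStorage.getItem('consent-record');
      return raw ? JSON.parse(raw) : null;    // null = no decision yet, so load nothing
    }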
The key is that the data controller be able to demonstrate AND RECORD that consent was received. If I clear my cookies, how does the data controller prove consent?
“keep a record of consent statements received, so [the controller] can show how consent was obtained, when consent was obtained and the information provided to the data subject at the time ... [and] also be able to show that the data subject was informed and the controller’s workflow met all relevant criteria for a valid consent.”
With that guidance in mind, and from a practical standpoint, consider keeping records of the following:
The name or other identifier of the data subject that consented;
The dated document, a timestamp, or note of when an oral consent was made;
The version of the consent request and privacy policy existing at the time of the consent; and,
The document or data capture form by which the data subject submitted his or her data.
Just seems like some huge liability here if you didn't record the required elements in a manner that allowed you to produce them. Does GDPR allow me to requisition my users' devices if I'm investigated?
The consent and the data collected via the consent need to be linkable. That's why it makes sense to store consent records for identified users on the server-side, because you "know" the user in that case.
For pseudonymous users, e.g. those you track via a Google Analytics cookie, you don't know who the user is and you (hopefully) can't reidentify them without the Google Analytics cookie. Since the cookie is stored in the user's browser, it makes sense to also store the consent record there. If you stored that consent record on the server side, you'd still need a cookie in the user's browser to link the consent record to them.
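For the identified-user case, here is a minimal server-side sketch (Node/Express; the route, the session handling and the in-memory store are assumptions for illustration only) that keys the consent record to the account ID rather than to any tracking identifier:

    // Server-side: persist consent for a logged-in user, keyed by their account ID.
    const express = require('express');
    const app = express();
    app.use(express.json());

    // Stand-in for a real database table of consent records.
    const consentRecords = new Map();

    app.post('/api/consent', (req, res) => {
      const record = {
        userId: req.session.userId,             // assumes session middleware identifies the user
        timestamp: new Date().toISOString(),    // when consent was given
        noticeVersion: req.body.noticeVersion,  // which version of the notice was shown
        decisions: req.body.decisions           // e.g. { analytics: true, newsletter: false }
      };
      consentRecords.set(record.userId, record);
      res.sendStatus(204);
    });

    app.listen(3000);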
> The consent and the data collected via the consent need to be linkable. That's why it makes sense to store consent records for identified users on the server-side, because you "know" the user in that case.
Yup, this is why a lot of websites try to lure you into logging in to the website to enjoy the full content (they won't tell you this is the reason, of course).
Thanks for the answer. Not sure if I misunderstood the GDPR, but I thought it had the requirement to be able to provide the consent documentation for any identifiers used (like a Google Analytics ID or Matomo ID).
The user has the consent documentation on their device. But I can't provide the documentation myself.
I actually don't think the risk is very high. And I agree that storing this information on my side is additional data privacy risk.
Exactly. If you read the cases, a fair number of them are gotcha-type things. For example, ask yourself: if Google tried to say they couldn't provide consent records (despite CLEAR language in GDPR) because they are only stored on devices - I have a feeling a MAJOR fine would be inbound.
But we get expert advice here that it is fine.
One of the claims I saw was that Google hadn't said something about information being used to advertise - but when I read the related disclosure (they have all the versions) it seemed clear enough to me. I'm not saying the ruling was wrong, but these cases sometimes hinge on issues just like this.
And we are just scratching the surface of things here.
If you clear site data for a site tracking anonymous consent, you've cleared your consent. No records necessary unless you are linking consent to user accounts stored on your backend.
Yes, years of computer use have conditioned users to just click the box to make the dialog go away. Add to that the minimising language and "make our products better" framing the form's presentation is couched in, plus the dark patterns that make opting out hard, and I think most users accept.
Looks great design-wise, but I still see some dark patterns, like making the accept button green and the decline button grey
It's still preying on the psychology of users that have been taught for years that green = good, accept, happy path, things will work
If you look at something like Apple's consent dialog, which has "Ask app not to track" and "Allow tracking" (can't remember the exact phrasing), the binary choice is presented in a fair and equal way which makes you actually think about what you're pressing, because there's no clear "right" choice they want you to press unconsciously
Edit: I understand though, you have a paid product, you boast about your acceptance rates as part of your marketing strategy, and no company is going to pay for something that decreases their ability to track users.
I've been in the same position as a developer where I'm asked to implement the maximum amount of obtrusiveness to coerce people into accepting tracking, like overlays with the famous 'body { overflow: hidden }', because our marketing departments start to go ballistic when they can't track every single user's every move. It just makes me sad sometimes that this is what we're dedicating our time to.
We don't boast about acceptance rates anywhere on our website, or claim that we optimize them in any way (which we don't). In my opinion having differently colored buttons makes declining easier for users as they have more visual clues to distinguish the choices. Most users are accustomed to a green "yes" button and a grey "cancel" button from system dialogs (though there are many variants these days), so they don't have any trouble finding the "cancel" button. Acceptance rate for Klaro on our website is around 50 % (33 % when it's on HN :D), which is much lower than what other solutions claim to achieve using dark patterns.
Personally I often click on the wrong choice when e.g. asked about giving a website access to my location on iOS, as there's no color difference between them and the texts are quite similar as well.
BTW Klaro is fully customizable so users can change the styling, and some users choose to display the buttons in the same size and style, though I don't think this influences acceptance rates too much.
> Looks great design-wise, but I still see some dark patterns, like making the accept button green and the decline button grey
Since all the other consent forms are like this it makes sense to not change the established standard. I would for sure misclick, since I've by now gotten this dark pattern ingrained (I automatically go for the grey button).
I really hope they actually issue a hefty fine here, instead of just a "now stop doing that".
Not issuing a fine would send a signal that simply ignoring the law until you're told to follow it pays off, since the companies involved certainly made much more profit in that time than they would have made had they followed the law.
Also, the fines should be issued to everyone involved in this mess - middlemen and library providers like the IAB, the ad companies actually collecting the data, and most importantly the publishers that sent their visitors' data to them.
> Not issuing a fine would send a signal that simply ignoring the law until you're told to follow it pays off
That’s how capitalism works, for better or worse. If as a person you break the law, you are prosecuted and punished, without consideration of who you are and what you contribute to society. It's very easy to be erased from society (a lifetime sentence) and/or to lose your lifetime earnings (via fines that are huge compared to your income potential).
For corporations it's totally different. Consideration of who they are is a huge part of the punishment. Countries don’t want to kill or severely injure companies, especially big ones, as they worry about fallout effects on their whole economy. As a corporation you can commit much, much bigger crimes and get the equivalent of a parking ticket in terms of impact.
Exactly: legitimate interest is an actual thing, but it doesn't cover the > 500 third parties I have found in these boxes (I selected all of them and processed the list in LibreOffice or something).
In the last 3 months some "innovative" CMPs have also added a feature where you have to click into a 'more info' box for each purpose to find the legitimate-interest checkbox to untick.
The lawful bases for processing data are listed in Article 6. Subject consent is 6(1)(a) and legitimate interest is 6(1)(f).
Consent is used when processing is not required to provide the service, no other basis applies, the processor still wants the data anyway, and the subject agrees. Legitimate interest applies when data isn't needed to provide the service but is necessary for related business functions (like fraud detection) not covered by another basis.
No basis except 6(1)(a) requires the subject's consent because the rest all cover processing required for the service to work.
Edit: I read "They're not" as "the bases aren't separate." On rereading, it could also be the interests aren't 6(1)(f) legitimate. Which, agreed, they're usually not. I'll leave this up in case it's useful to someone.
Legitimate interest being opt-out makes sense, for example for fraud prevention. I absolutely agree though that adtech often blatantly claims they would have a legitimate interest, whereas they should ask for (opt-in) consent.
No, it doesn't make sense, because if it really were legitimate interest then they wouldn't have to ask for consent. That's the very point of legitimate interest. You could say that they are being nice, but that's obviously not the case and also would make no sense. You either claim that you are collecting data because it's your legitimate interest, or you ask for permission. What they are hiding here is what you say: making some of their cookies opt-out.
They don't need to ask for consent for legitimate interest; that is why those are opt-out (default on) as opposed to the consents, which are opt-in (default off)
An opt-out for legitimate interest makes no sense. You either need it for a purpose or you don't. A typical legitimate interest is fraud detection. Being able to opt-out of that would defeat the purpose.
"The Interactive Advertising Bureau (IAB) is an advertising business organization that develops industry standards, conducts research, and provides legal support for the online advertising industry. The organization represents many of the most prominent media outlets globally, but mostly in the United States, Canada and Europe." From Wikipedia https://en.m.wikipedia.org/wiki/Interactive_Advertising_Bure...
Am I the only one who thinks this is all a bit mental? Say you have a (real, physical) business. And say you have some customers who are regulars. You can use your eyes to see what your customers frequently buy/engage with and your brain to remember. You can also use a notepad to write things down. This is commonplace - "The usual today John?".
Now say John travels to another town, and the proprietor of a similar establishment in that town, wanting to provide John with the best level of service, calls you to ask "Hey, what does John like?", and you tell them.
Now we just supplement your eyes and notepad with technology, and replace phone calls with packet exchanges. What has changed exactly? Don't you have a right to record who transacts with you? Isn't that information yours, to do with as you please? Can "John" command us to forget we ever saw him? Where in this sequence are anyone's rights violated? How is any of this reasonable?
> Now say John travels to another town, and the proprietor of a similar establishment in that town, wanting to provide John with the best level of service, calls you to ask "Hey, what does John like?", and you tell them.
That was already way too invasive, yes. You could probably get away with it before, because the scale was so small that it wasn't quite as horrible. There is after all a difference between occasionally violating one person's privacy and violating the privacy of thousands of people a minute. Not an actual ethical difference, but there are finite legal resources to go around.
Also, I object to this idea that because the pieces of behavior are acceptable the combined effect is automatically okay. It's perfectly legal to own and use a camera. It's perfectly legal to own and use a telephoto lens. It's even legal to look at your neighbor's house. Nonetheless, taking a camera and pointing it through a telephoto lens at a neighbor's house and recording 24/7 is an excellent way to get arrested.
All of what we're discussing is within the realm of decency, not law. Yes it's impolite to gossip about people, always has been. It's however tyrannical and totally inappropriate to legislate decency.
"decency" is not definable, that's why it's tyrannical to legislate about it. Laws serve to establish a framework merely for peaceful coexistence. To use them to try to make everyone decent is surgery with a sledgehammer.
I'd actually challenge you to find a law in society that doesn't have some origin in vague decency - the legal framework built above decency is something we expect to be rather stable over the long term but, at a really basic level, we have a law to not murder people because we really don't appreciate it when people do it - not because there is some natural law inscribed in our DNA stating that murder must be a crime.
Laws conform with societal ethics and those ethics absolutely change over time and are never uniformly agreed to by all individuals.
Lastly - scale absolutely does matter when it comes to laws we want to enforce. We don't want to enforce a no-gossip law because the invasiveness of enforcement would be unbelievably deep, so decency is there to tell you that while you won't get arrested for doing a thing you should feel guilty about it - most members of society get equipped with this guilt during their upbringing and so criminally malicious gossip is a problem we mostly ignore at a societal level.
Laws are absolutely BS in their inherent nature and a construct of society that could easily shift radically with large political shifts - but that doesn't mean they're invalid.
We have laws not to murder people because the state assumes a monopoly on violence. In absence of retributive justice we did and do continue to murder each other in blood feuds that last generations. This is perfectly normal human behavior. The state is there merely to provide a framework in which people who hate each other can be expected to surrender their natural right to use violence.
To me that's where the right of one group (the government) to legitimately use force ends. The sledgehammer that is the legal system should only be used to administrate the peace.
Beyond that, the only concept that can really be impartially measured is liberty. How free are you? In the graph of all possible actions, which are legitimately available to you? Everything else is an attempt to define good or evil, or right and wrong, and is therefore some form of religion. In a world where we are all equal, have no oracle to discern good from evil, and disagree diametrically, the only reasonable thing to do is optimize for liberty, and let everyone figure it out for themselves.
Scale does matter, but not in regards to rights. If you have the right to do a thing, you have the right to do it a million times. If you have the right to write something down, you have the right to keep it in a database. If you can publish your letters, you can do so over HTTP as well. This isn't a new conversation. People have been publishing memoirs of their private correspondence since the printing press. What disturbs me is how we seem to be shifting our consensus that they have the right to do so.
So, you’re using Max Weber’s theory of the state in a contradictory way in my reading. Weber’s theory of the state as being the society which derives its authority from a legitimate monopoly on violence was contrasted _against_ the very “natural liberties” that you seem to espouse (e.g. the rights of peeping toms, the rights of private shopkeepers, and the limitation of the government). Either you argue that the legitimacy of the state stems from its legitimate monopoly on violence as a community (thus allowing it to dictate “decency” as you say) or you argue that the legitimacy of the state derives from the citizenry’s own natural rights to self governance and liberty.
I'm not strictly arguing for Weber. I'm basically proposing that beyond majority consensus, we have no way of determining that the peeping toms or the discriminatory shopkeepers are in the wrong. In combination with the problems of majority consensus (if a majority agree to murder a minority, is that okay?), the presupposition that we are all equal, and have a right to exist as and where we are, the only reasonable conclusion is to enforce the absolute minimum of law required to maintain the peaceful coexistence that differentiates civilization. This is doubly true in a multicultural society.
> We have laws not to murder people because the state assumes a monopoly on violence.
This theory utterly fails to explain laws against murder from the millennia when there was no such thing as a state that claimed a monopoly on violence. In the feudal system there was no unified state entity, just a bunch of people invested with certain rights, organised into a nested hierarchy of fealty - no monopolies there, but they still had no problem finding people guilty of committing murder.
> If you have the right to do a thing, you have the right to do it a million times.
You've never seen those signs that say you get only one free cup of coffee, have you?
The source of legitimacy of feudal monarchies was divine right. This is not the case today. Our governments derive legitimacy from the consent of, and in defence of the liberty of, the governed. This is why I object to these laws. They only constrain my liberty to interact peacefully with other people, and I certainly don't recall consenting.
> You've never seen those signs that say you get only one free cup of coffee have you?
Indeed I've seen the signs. I would very much object to similar laws.
You are free to leave if the governed collectively have decided on laws you don't agree with, and they'll probably ask you to if you go on about your demand to be granted veto rights on all legislation.
As an aside, I'm slightly curious how you explain the legality of said sign using your absolutist libertarian framework.
"free to leave" isn't really an excuse for tyranny is it?
The legality of the sign is derived from the fact that it's on private property, from which the owner has a right to expel anyone for any reason. Violating the rules of the sign is simply grounds for expulsion from said property, not criminal consequences. That's the difference between a rule and a law.
The state needs not have a monopoly on violence. Dueling was outlawed in France in 1626, and early modern France was certainly a state. The dueling laws only became necessary because our mores on violence changed.
By whether these laws enforce negative rights or positive ones. You should have a right to do whatever you want as long as you're not directly interfering with the rights of others to do the same. That's what it means to be free.
The demarcation represents a minimum set of restrictions on behavior which fundamentally differentiate civilization from its opposite - namely the presupposition of peaceful coexistence. From there, there is no means to derive the degree of compulsion to cooperate, so better not at all.
All of those things are basically religious terms to me. Some people subscribe to the religion that they have a "right to be forgotten" or a "right to their data". I don't subscribe to this religion, I want out.
You can lobby for the law to be changed. Don't be surprised you can't get a critical mass for "Yes, I'd like businesses to be able to track me".
You could just as equally declare copyright law or property law to be a religion and insist you don't want to subscribe to it, but society as a whole does, and so if you want to participate you just have to lump it.
What if the laws were the other way around, and businesses were mandated to track everything they can about the people they interact with? How would that tyranny be different from the one existing currently? Just because the majority decides something, that doesn't mean it's not tyrannical. People should be free to peacefully interact with each other as they individually see fit. Some people may track, some may choose not to and advertise that fact. Any picking of sides is totally arbitrary and therefore unjust.
The European Union evolved from the European Economic Community, and it is generally pretty pro-business and pro-freedom.
One thing you want to keep an eye on that is actually harmful to business (and economic freedom) is when some part of the economy gets caught up in a race to the bottom. This is a situation where everyone's best move is to defect, and then defect again, all the way down until no one is free/making a profit/happy/etc. anymore.
In this case there is a clear race-to-the-bottom situation where everyone ends up having to track everything about everyone, just to keep up with the competition. As this would pretty much be a panopticon-like tyranny by any other name, it's clearly not desirable.
In this case the EU is trying to make legislation that stops the race from going all the way down, by leveling the playing field so that everyone can conveniently say "see, we can't go further because it's actually illegal now".
So the net (long-term) freedom is actually increased by this measure.
Is it not tyrannical to prevent you from selling copies of Lord of the Rings without paying the author by that same logic?
If the majority of society was in favour of tracking everything then the opponents of that would have the same options of lobbying to change people's minds. As seen in the difference between end user reactions to GDPR vs ACTA, that seems like it would be much easier.
Yes? Copyright law was widely regarded as tyrannical when first introduced. Anything to which violence is an inappropriate response should not be legislated.
Because once you've told something to someone it's not yours anymore? Sure you can sign an agreement, but the consequences of violating it are civil, not criminal.
During WWII in Europe, numerous people in occupied territories learned that it is very important to be able to retain your privacy and ensure people don't learn too much about you. While people of Jewish ancestry are often taken as the canonical example, many other groups of people and individuals learned that "I have nothing to hide" is most definitely never true.
Would it surprise you that it is in fact Germany that has very strong privacy protections these days?
Decency works for managing personal behavior on a small scale, with an ultimate check of free association. Commercial surveillance invalidates both the assumption of scale and the ability to opt out.
In general it seems like you're just asserting that value judgements should be scale free, while ignoring qualitative criticisms.
Erm. Are there any laws that you're okay with, then? "Murder is bad", "People should get paid for the use of copyrighted works", Freedom of Speech, private property existing, the modern concept of a fair trial... every one of those is a value judgement.
I'm sorry to break it to you, but you're really not. Commercial surveillance is exactly this imposition of value judgements onto people without their consent.
I myself am a libertarian. A government is merely a large corporation that is impractical to opt out of. Conversely, corporations that are impractical to opt out of constitute de facto government. Data protection laws like the GDPR attempt to constrain the power of corporations so they don't rise to that level, which ultimately constrains the amount of government.
A government is much more than a large corporation that is "impractical" to opt out of. They are the sole arbiter of legitimate force, and the sole entity that can govern your behavior in places you have a right to be. Corporations are perfectly practical to opt out of, as demonstrated by the many people who do. Find me one person who doesn't pay taxes and doesn't go to jail. I know plenty of people who don't interact with FAANG at all. Corporations will never rise to the legitimate use of force. If they do, they'll be governments.
> They are the sole arbiter of legitimate force, and the sole entity that can govern your behavior in places you have a right to be
This fully depends on what you include in your definitions.
Imagine this: A company owns a vast area of land. You agree to contract with this company in order to be on their land. This contract includes things like using physical force against you if you violate other terms spelled out in the contract (just as you can contract to have violence done to you at a BDSM club). The contract defines a technical term "right", the definition of which spells out some things you're positively allowed to do, and is somewhat harder to amend but not impossible. The terms allow you to sublease a bit of their land for your exclusive use. Your sole way to terminate this contract is to completely leave the company's land and pay off any balance you owe. Call this company USG and it is indistinguishable from the United States Government.
> Find me one person who doesn't pay taxes and doesn't go to jail
Most people who have under-the-table income and don't report it. Same as how it's often possible to get around breach of contract when your counterparty doesn't find out. Model vs. reality. And note how the requirements for keeping your income unreported mirror the requirements for avoiding transitive association with a given corporation.
> I know plenty of people who don't interact with FAANG at all
1. There are likely still surveillance profiles being kept on them. 2. It's hard to believe said people use the web for anything, given the prevalence of Google Tag Manager and CAPTCHAs. 3. More entrenched than FAANG are Equifax and LexisNexis, which are even harder to distance yourself from. I'd say it's easier to renounce your citizenship of most countries than it is to avoid the worst of the surveillance companies.
> Corporations will never rise to the legitimate use of force
You keep using this word legitimate, which entirely depends on perspective. I would say that it is plainly illegitimate to throw someone in a cage for smoking a plant, and so calling government inherently legitimate is a bit dubious.
> Corporations will never rise to the legitimate use of force. If they do, they'll be governments.
Corporations, as creatures of law, are an apparatus of government. And they’ve participated in the legitimate use of force since the very dawn of corporations as a thing.
If gossip were scaled up enough you might have laws created around it (actually, some jurisdictions have anti-libel laws, which basically target a form of gossip). We don't currently attempt to legislate decency because in most cases the system self-regulates, just as it used to do with the shopkeeper & fungus cream scenario mentioned in another comment. Then technology came along and increased the possibility of information sharing (and potential harm) by orders of magnitude, and once it was determined that the system no longer self-regulated, laws such as the GDPR were drafted.
Well, online mob cancellations fuelled by social media are exactly what scaled-up gossip leads to. So we are already there, and it's about time that something is done about it.
Yes, and I'm suggesting these laws are unjust because they impose the values of some segment of people on to others. Leave people to associate as they see fit. Some groups will track you, some won't. You can choose which to patronize, on which browsers.
Literally all laws “impose the values of some segment of people on to others”, so according to your criterion all laws are unjust. I guess that’s a form of anarchism, and you’re welcome to it. For me, it’s simply not an interesting way to think about justice and how society should be ordered, because it seems to be the end of a conversation, rather than the beginning.
Laws against theft are just because violence is a legitimate form of recourse against thieves. In order to surrender this right to violence, people need an alternative recourse. This furthers my point - since what we're talking about is basically gossip at scale, would you argue that violence is a legitimate form of recourse?
Why is violence a legitimate form of recourse against thieves? Perhaps violence is so inherently evil that it is better to part with one’s property than to protect it with force.
No, you don’t have that right. If I buy toe fungus cream at your store you absolutely may not tell other shopkeepers about that. I can’t even get my head around a mentality that considers this behavior to be moral or even normal.
The laws of decency, respected by all decent people. Sure, keep a ledger of what I buy from you. But do not tell anyone else anything about me. I respect this principle in non-commercial life, and so do my friends. You are not exempt from the rules governing decency just because you sell things.
But on the other hand, if you had an interaction with the shopkeeper then wouldn't the same laws of decency prevent you from telling others about that encounter?
There are plenty of deals that you make with vendors/merchants that are given in confidence; a merchant may give you 50% off if you promise not to spread it around. Other than those occasions, "shopkeepers" not only want you to spread around that you shopped there, they actually depend on it and will often pay you to do it.
As a customer, I have no interest in you advertising for me, so it's not comparable. Any information that you spread around about me is likely to give other "shopkeepers" greater leverage over me (which is why vendors don't do this in the real world unless they have some sort of financial relationship with each other.)
It depends what I tell. If the shopkeeper tells me that he recommends a particular brand of toe fungus cream because it works for him, I will never reveal his condition to others. But I might mention his hours of operation, or that he runs a good store, because that’s public information.
There’s a difference between being impolite and being indecent. We have peeping-Tom laws because it's beyond impolite to press your face against someone’s window and peer through the gap in her drapes. Are such laws tyranny? Most think not, because they are there to protect potential victims.
If someone can stand on public property and press their face against your window, they should have the perfect right to do so, for as long as they see fit.
When I first became aware of these laws I thought as you do. But after a while I came to think the laws are OK, maybe even necessary. You say “draw your curtains”, but in my example the curtains are already drawn. Unless you seal them with duct tape, there are almost always tiny holes or gaps through which someone can see, if they're allowed to do things, like pressing their face against the window, that have no legitimate purpose aside from the intention to invade someone’s privacy. Our laws routinely consider the intention and purpose behind the act.
Before, we didn't have these laws because this wasn't a problem in practice - nobody was calling other businesses at scale to tell them who was buying fungus cream. If they had been, we would've had a law equivalent to the GDPR to prevent that.
Actually, there was one incident in the US where a politician's video rental history was disclosed against his wishes and as a result a law was drafted to prevent this practice in the future: https://en.wikipedia.org/wiki/Video_Privacy_Protection_Act
> Isn't that information yours, to do with as you please?
I could equally ask "Isn't that information John's, to do with as he pleases?".
Property is a legal fiction, and "intellectual property" doubly so; but it seems perfectly reasonable that society should decide that information about a human person should be controlled by them, rather than by the artificial person of a corporation.
If John asks you to make his drink preference available to every competitor of yours, and you for some reason agree to provide that service to him, then of course John can get the benefit of that information sharing, but this should only happen if he specifically opts in to it.
(In reality you wouldn't provide this service helping your competitors, and the information sharing could be managed by another service, which John would probably have to pay for, and service providers would compete based on the security, speed, ease of use, and accuracy of their service).
"information" itself is a fiction. The only question is to what degree can someone else control what you do or say? The "information" is "mine" in the sense that I should be able to use my brain and mouth as I see fit. Anything else is awfully invasive don't you think?
> The only question is to what degree can someone else control what you do or say?
Someone else can't control what you do or say, but they can establish consequences for what you do or say. Nobody can control whether you assault people or not, but they can certainly put you in jail for it.
> Anything else is awfully invasive don't you think?
I don't think so. Information that you got for the specific purpose of providing a service shouldn't be yours to spread as you see fit. In fact, there have been strict legal procedures in place for a long time for certain professions (e.g. medical sector) to enforce this.
Just like I don't expect my doctor to spread information about my hemorrhoids, I don't expect my bartender to spread information about my drinking habits.
Establishing legal consequences is our only mechanism of legitimate control. If you can be threatened with fines, and with arrest for non-compliance with the fines, then you are being controlled. Tyranny of the majority is a thing. This is why laws are an extremely blunt instrument. If people were truly interested in privacy, they would simply boycott businesses that violate it. If they don't, there clearly isn't enough interest to possibly justify the use of force.
If you work for a corporation, you already are prevented from using your brain and mouth as you see fit (or at least, what you "see fit" is being heavily influenced by your desire to stay employed by them).
Consumer privacy regulations don't control your brain and mouth, they control the incentives of companies, who are not humans, and do not have mouths or brains. Those companies then control the incentives of their employees who can choose whether or not to work for them.
I'm not prevented by anyone. No one will impose a fine on me for speaking my mind at work. They may just terminate me, which is of course their prerogative. They can terminate me anyways. Small businesses with individual owners are also burdened by this legislation.
Since always? If someone asks you a question you have a right to answer as you see fit, don't you? Shouldn't you? Even if you've entered into an agreement to keep a secret, it's only a legal contract that you'd be violating (which you have a perfect right to do) not a law.
You seem to be using “right” to mean something like the state of affairs where there is no criminal penalty. Am I mistaken? I don’t think most people understand the term in that way. So my answers would be no, you don’t have a right to answer as you see fit; no, you should not; and no, you don’t have a right to violate a contract.
Yes, I mean "right" as in action that may be taken with no criminal consequences. What do you mean by the term?
With this definition, you do in fact have a right to breach any contract as there are no criminal penalties for doing so. That's why most contracts specify what happens if they are breached under various conditions. There's a major distinction between civil and criminal law.
I mean something that it is wrong for you to do. You indeed do not have the “right” to violate a contract; that’s precisely why doing so exposes you to civil remedies.
A lot of tech regulation exists specifically because of the scale of collection/automation that is now possible. For example, surveillance cameras, license plate readers, and facial recognition could all be replicated by posting a dozen cops on every intersection in the country, yet many jurisdictions still have laws against them.
To answer your specific scenario, it would be pretty creepy (and possibly illegal) if shopkeepers in different towns were calling each other and discussing my specific purchasing preferences.
> Now say John travels to another town, and the proprietor of a similar establishment in that town, wanting to provide John with the best level of service, calls you to ask "Hey, what does John like?", and you tell them.
That's not how it works in the real world though. Why would a restaurant answer someone random calling them and asking what one of their regulars is typically ordering? I see no possible benefit in answering that question.
How would they even know which restaurant to call?
This only works in the online advertising world because whenever John enters a shop, restaurant or other establishment his photo is taken and sent to a central authority. That central authority can then sell space in some of those shops to push a product or service to him based on the history of shops he's entered.
> Now say John travels to another town, and the proprietor of a similar establishment in that town, wanting to provide John with the best level of service, calls you to ask "Hey, what does John like?", and you tell them.
You casually say this like it's obviously OK, and like it ever happens in the real world. If I'm doing business with you, and you investigate me to discover other people that I've done similar business with in order to ask them what I like, you're officially a creepy business.
The reason this is a matter of creepiness and not law IRL is because no business with more than a couple of customers could manage to regularly do this. The internet is what provides the dragnets, and the ability to be creepy at scale.
I hope you are playing devil's advocate and don't mean this seriously.
1. When you enter it into a computer, everything changes. The scale with which data can be stored, distributed and aggregated is just staggeringly more huge.
2. People don't like ads, except for a few weirdos. Ads are not a service in the interest of the consumer.
3. Knowing someone personally is fine, but passing on that knowledge without their consent, or at least knowing for certain that it is in their interest, is a no-no.
And those are only the three most egregious things you get wrong.
A computer is just an augmentation of a mind. It does the same things, just more perfectly. Shouldn't we all be able to augment our minds as we see fit? Should our freedoms change in the process?
People who attract new customers with ads must like them. People who were informed of a product that meets their previously-unserved needs must also, right?
3. is just plain old gossip, which is also what I'm describing. It has always been in poor taste, and should never be illegal or regulated at all. Only religions do that.
Maybe John traveled to another city precisely so that the things he does there are not linked to what he does at home. If John wants me to share his preferences, he should let me know, and the other business owner should ask him whom to contact, in order to spare him the inconvenience of listing all his preferences himself.
And if that shopkeeper starts following you around everywhere you go, noting what you do, what you buy, what you read, with whom, and then shares this with whomever they want, for profit, isn't that stalking, isn't that person basically a creep? And if they did it with all of their customers? And since it's physically impossible for a single person to do that then they will build an army of robots that will do that for them, is that still OK? Because that's already a reality we live in.
I mean, yeah if it's their shop of course they can follow you around with a clipboard and sell copies to whomever wants one. That's why it's their shop. If you don't like it, you can leave. If they follow you home, yeah they're stalking you, but that's not what we're talking about is it?
Bad example.
You are not memorizing one customer but millions, you don't note one favorite purchase but all, you don't tell one other store owner but thousands.
One locust is harmless; thousands are a plague.
The question to ask yourself is always 'if I prefix this with 'at scale' does that change the equation?'. And if the answer to that is a 'yes' then your online/offline analogy doesn't hold water.
The more you move from "your brain" to "database", the bigger the risk of abuse.
If you keep it in your head, it's not going to be stolen or abused. If you put it in a notebook, the risk increases, but you at least aren't going to be doing this at a large scale, simply due to physical limitations.
Various laws have different thresholds, but usually it's something like a "systematic collection of data" or "automated processing" (I think it's the latter for GDPR), which seems like a reasonable compromise to avoid hampering the low-risk small scale use cases.
Edit, looked it up, this is the definition GDPR uses:
This Regulation applies to the processing of personal data wholly or partly by automated means and to the processing other than by automated means of personal data which form part of a filing system or are intended to form part of a filing system.
So if you keep your database in a notebook, you're probably still fine, because the lack of structure makes it impossible to do nasty things at scale. Once you switch to alphabetically sorted index cards, you've crossed the threshold.
This is exactly what seems crazy to me. Why does augmenting my capabilities with technology arbitrarily subject me to additional regulation? I should either have a right to do something, or not have that right. That's what "inalienable" means. In this case an inalienable right to free expression.
> I should either have a right to do something, or not have that right.
Should you have the right to watch which direction I'm going when I pass by at an intersection?
How about at the next intersection?
And the one after that, and the one after that, and the one by my home, and all the ones that I happen to go by when I next leave and go somewhere?
There's no single point at which passive observation turns into stalking but we still have laws against stalking and it's still perfectly ok and legal for you to watch where I'm going. If you understand why it's ok to look around you (and perhaps even take notes or draw what you see, snap a photo) but not OK to do that systematically around someone, you should also understand why we might want to restrict automated unwarranted and consentless data collection, even if taking some notes is OK.
The other thing is scale. Laws against seemingly minor things are enacted when that thing becomes widespread enough to upset many people. You probably don't upset people too much by taking some notes in a shop. If every shop had a fleet of staff dedicated to the same thing, that probably would upset people and lead us to a similar discussion.
For the same reason why you are allowed to fish with a single fishing rod in some places, but can't just drop a giant net or trawl the whole river: Scale matters, and regulating the tools is often an easier and more effective solution than trying to regulate scale directly.
In particular with data, the processing method makes a qualitative difference, between individual disconnected people each knowing one small snippet of your life, and a megacorp having an overview over whatever you're doing at every moment. By de facto limiting it to "one brain", the panopticon is prevented.
Your rights end where they start impacting other people's rights, and drawing those lines is what we have governments for. Even if you were to use just your memory, if you were tracking what someone is doing in such detail as ad companies do, you'd likely a) not be able to do that to more than a few people b) receive restraining orders for stalking.
Augmented capabilities are different capabilities. Your inalienable right to free expression only exists for certain definitions of expression. It does not extend to incitement to crime, revealing classified information, deceptive commercial speech, misleading investors in a public corporation, etc. You have (in my opinion!) an inalienable right to have a pistol, but not to have a hydrogen bomb.
No, you should not. We draw the lines wherever we think we want them. If you think the line is in the wrong place, you bribe a legislator to move it, or vote, or something like that.
A recognition that regulators aren't realistically going to be able to stop a small business owner keeping customer preferences in a notebook, and that the cost/benefit of chasing after them is low, should not preclude taking action when the practice gets scaled up to monitoring way more people in a way more pervasive fashion.
We've had laws governing public spaces and common rights of way for millennia. Governing what and to whom people can communicate, and what they are allowed to remember or record, that's the new thing I'm objecting to.
GDPR doesn't care whether it's in a database or in your notebook; you are still subject to the same laws, and as such the data in this case belongs to John. You need his permission to record it.
>Now say John travels to another town, and the proprietor of a similar establishment in that town, wanting to provide John with the best level of service, calls you to ask "Hey, what does John like?", and you tell them.
That's unacceptable to start with.
>What has changed exactly?
You've automated something unacceptable?
>Don't you have a right to record who transacts with you?
Within a certain context, you do. The data that is entrusted to you is given on the assumption that you're acting in good faith and won't trade that information without consent.
>Isn't that information yours, to do with as you please?
Absolutely not, at least not in the legal systems we have in Europe.
>Can "John" command us to forget we ever saw him?
Yes, he can! At least in my opinion, and virtually every other European's, he does. The right to be forgotten is an active subject of discussion[0]. The stance in the US is that it runs contrary to freedom of expression. The stance in Europe is that personal freedom implies being sovereign over one's own data. In technical areas, the right to be forgotten is interpreted as the right to erasure[1], which happens to be part of the GDPR. I've used that right myself several times, and I end up being very mindful of my usage of data in the software I write.
>How is any of this reasonable?
Your ease of business doesn't trump someone's rights over themselves, the data they generate being an extension of it. End of story.
On what grounds exactly? When interacting with people they are able to observe you and record their observations. From where do you derive a "right" to control their behavior in this regard?
How could you classify an observation about yourself made by someone else as "yours"? How would you enforce this "right to be forgotten" when people carry around storage mediums made from meat? Or are you simply suggesting that you should be forced to go through all your letters, diaries, notebooks and ledgers on the whim of someone's demands?
Technology is making this easy to scale up, there will always be people who want to abuse this, and the abuse potential grows with scale.
What was once 'I'm particularly fond of a certain shop's version of a food item' can then become every shop selling that category of item automatically guiding you to the same type of item. Besides the part where everyone knows your name and keeps a quick dossier on you, like you're a celebrity but without any of the perks, life would be so drab if it was "The usual today, John?" at literally every place, everywhere, always.
I'm more or less in agreement. What we need is a change of mentality/awareness, i.e. that everything you do online is being tracked and shared with other parties. Because it mostly is.
There is no difference between offline and online in this context.
You almost certainly had a legitimate interest in remembering your guest’s preferences, but retaining only the necessary amount of personal information is key, and sharing it with other firms is more restricted - an example of where this could be legal is sharing warnings about violent guests.
The problem is the scale at which this is done. Technology not only allows collecting much more of this data, it also puts that data at much greater risk of being stolen than a single physical notebook.
If you employed an army of people to be able to take photographs and remember or write down what every customer looks like, what time they come in, what they typically wear, how long they spend looking at each product, etc... most people will find that creepy and will take offense at that.
They may find it creepy, but why should it be illegal or regulated by the government in any way? If you don't like the creepy establishments, go elsewhere?
And yet I see every single supermarket offer their own membership card "to get discounts" and everybody is happy with it. The only purpose of that card is precisely to track your purchases.
It seems to me that some people in society decided that websites aren't allowed to sell ads based on what you view, but all the other tracking in our society is just a-okay. I've not seen a single campaign or push against predatory membership cards or credit card info being sold.
Depends if you're in the EU but I guess you could've lobbied against the GDPR when it was being drafted. You could also lobby against restaurant food safety regulations, or discrimination laws. The reason these laws are there and stick around is because a majority decided that these behaviors were noxious and should be outlawed and the current majority appears to be happy enough with the current situation to not demand laws to be changed.
> every single supermarket offer their own membership card "to get discounts" and everybody is happy with it
It is opt-in (you can decide to not swipe it when buying the aforementioned fungus cream if you don't want it associated with you), the data collection is relatively common knowledge and is disclosed when you sign up for the card (and if it isn't then that's a breach of the GDPR and should be rectified).
In comparison, online data collection is at best opt-out and at worst mandatory and often invisible (and even if you could see what data is collected from your browser, you have no visibility on what further processing is done on it or to whom it gets transferred or sold).
> but all the other tracking in our society is just a-okay
Source?
> I've not seen a single campaign or push against predatory membership cards
Those are opt-in.
> or credit card info being sold.
Every time the selling of credit card info comes up on HN people speak out against it just like they do against ad tracking, and the only reason nobody else talks about it is because they most likely don't know (would a reasonable person expect their bank to be sharing their purchase info with third-parties?).
Both of these issues are addressed by the GDPR by the way; it covers much more than just ad tracking on the web.
But GDPR bars websites from doing what the stores are doing. A website can't refuse to serve you content if you don't agree to the tracking, but membership cards work exactly like that. You only get the membership discount if you agree to the tracking. Websites aren't allowed to do that.
The store doesn't bar you from entering & shopping without a membership card. If they did, it could very well be that the GDPR would equally apply and forbid them from mandatory data collection as a prerequisite for shopping there.
The opt-in of adtech is visiting the website and the reward is the content on the website. You're the one that starts the chain of events in both cases.
>The opt-in of adtech is visiting the website and the reward is the content on the website.
No, quite clearly I've opted in to the site's content. Nobody has ever knowingly navigated to a website to enjoy its advertising and tracking (except for maybe 3-5 individuals).
>You're the one that starts the chain of events in both cases.
This is victim blaming. I started the chain of events leading to the rendering of the website content, not the ads or tracking behavior to which I would otherwise be oblivious. This is akin to saying I asked for a computer virus by purchasing this computer.
What if everyone does it and so you literally don't have anywhere else to go?
What if the data collection was done in such a way that most non-technical people aren't even aware what data is being collected and how it is used? In my hypothetical example about a business employing an army of workers to follow, photograph & take notes about every customer the behavior would at least be visible by the customers (and so they could choose to go elsewhere), which is not the case with modern technology - data is being collected silently in the background.
> why should it be illegal or regulated by the government in any way
There are plenty of other unlawful things you could apply this question to. Society enacts laws to dissuade & punish behaviors that the majority finds reprehensible.
That assumes that the customer is aware that it's happening and that there are other stores that don't do it. Since at least one of those things is vanishingly unlikely, we have legislation.
Honestly, more and more I'm starting to think GDPR is just an excuse to fleece "evil foreign tech giants". It's a set of arbitrary rules with vague and selective enforcement that seems not to be completely understood even by the legislators who wrote it, as demonstrated by their not knowing the answer to the simple question: are pop-up consent forms acceptable? It's whatever the bureaucrats don't like that day, really.
Guess some US company will have to go to court (and subsidize the European Legal Industry doing so) for the privilege of figuring it out.
It wouldn't be so comical if FAANG wasn't full of European devs who chose to innovate in the Valley, probably to escape this very bureaucracy.
Doesn't seem arbitrary to me. It might be cultural though. Somehow the need for privacy is just not something that is in the lived experience of eg many Americans, apparently.
Say we've learned over time that the color purple is really dangerous and needs to be regulated (work with me here for a second).
Now try explaining "Purple Considered Harmful" to someone who is colorblind. The fact that suddenly grape juice is bad, but orange juice is good... weird. Or the fact that you'd not be allowed to mix tomato ketchup with blueberry jam. And you'd just randomly get fined for trying to sell that "dark green" sweater. Madness!
It would all seem quite arbitrary to the color blind person. Meanwhile someone with color vision would instantly see that all those things are purple, and thus clearly harmful.
The example might seem a bit contrived, but something similar might be going on with the European concept of privacy.
People have been hurt due to PII issues in recent history, so one would like to keep control of PII. But try to explain that tingling spidey sense that has kept one's family alive in the last century to someone who has never encountered Nazis or Soviets before, and has instead lived a life of unfettered freedom. That might be rather difficult. Especially if you can't get the base concept across, and need to do it in terms of (very arbitrary seeming) examples.
Honestly, if you fail to comply with the GDPR, you probably also failed to comply with laws stretching back at least to 1995. The GDPR mostly changed the ability to hunt down violations, and further harmonised and clarified laws that were already in force. It's easier to comply in the post-GDPR world than before it, and there's general guidance for, let's call them, "honest mistakes", as well as coverage of some internet-specific issues.
And the Valley is pretty much there because, when the long chain of capital-hungry events was building it up, we were still rebuilding from the last war while funding preparations for round three, thanks to Truman.
The fact that it took so long clearly demonstrates the incompetence (or potential conflicts of interest?) of the regulators, but better late than never. I guess they could no longer maintain their charade under the pressure of various pro-privacy organizations.
Note that even then this seems to be just a ruling and actual consequences are still dependent on individual regulators. Given their prior lack of action it will presumably take years before we see any fines resulting from this.
In my layperson's knowledge of GDPR, these awful consent popups always seemed completely illegal:
1. They prevent access without a lengthy/arduous process. Certainly in violation of the spirit of the legislation and almost certainly also the letter.
2. This was of course entirely intentional, in order to annoy users into clicking yes and laying blame on the GDPR
"The GDPR made us annoy you". It doesn't.
3. They often do not allow a single click deny, you have to go through sometimes dozens of vendors and deny them one-by-one. This is so obviously illegal it isn't even funny.
4. What's worse, if they do have a "Deny all" button, it's almost certainly there to trick you.
Because they have essentially the same list of trackers duplicated under the "legitimate interest" category. Which "Deny all" won't catch. You have to "object" to the legitimate interest. So if you hit "Deny all", you will instead be tracked by all.
This is so brazen it's almost breathtaking.
Anyway, good to see progress on this front. The ad-industry is still in deep denial about GDPR, thinking that they can continue their business model in the face of it. They can't. Their business model is illegal, and has been since GDPR came into force.
The conflict has been brewing for some time now, weaving its way up through the channels.
Yeah, I still don't quite get the argument against do not track, and why its clear declaration of intent couldn't be made binding. I mean, you're effectively telling web sites "Do not track me", and they are responding with "Hi, we'd like to track you - please spend ten minutes working through our dark patterns if you're not OK with that".
Unfortunately, DNT is not a clear declaration of intent, because privacy evangelists view it as their moral duty to make that decision for everyone.
It's an inconvenient fact that - perhaps a decade ago - we had DNT, and advertisers were starting to respect it, but then browser makers decided to default it to on, making it pointless.
You're very close, but I think "browser makers" makes it sound like it was more than one. Microsoft Internet Explorer defaulted it to on. Every other browser was in agreement that it would only get advertising industry buy-in if it was defaulted to off.
I think Microsoft's default-on stance was likely intentional sabotage - Google operates a big ad network and would have to deal with a lot of the fall-out.
Why shouldn't the default be to not be tracked, and only start being tracked if you explicitly want to? Advertisers always frame this conflict as though it's absurd to expect them to just stay out of our lives, and anything that makes it easy or default to avoid them should be rejected as impossible.
The feasible choices are between (a) DNT, off by default, that the more responsible and regulated side of the ad industry respects or (b) DNT, on by default, that everyone ignores.
Which one is the greater good?
In other words, you're welcome to walk up to me, slap me in the face, and call me a son-of-a-bitch... but that's probably not a great start to a conversation that ends with "Would you please work with me on this?"
> In other words, you're welcome to walk up to me, slap me in the face, and call me a son-of-a-bitch... but that's probably not a great start to a conversation that ends with "Would you please work with me on this?"
Wow, that's quite some re-framing going on there, if you're casting yourself as the advertiser in that sentence. It's more like you were regularly "borrowing" my car without permission. Do Not Track is a bit like the lock on the car, which obviously you can get around in 30 seconds flat. Everyone has a lock on their car, right, and it's installed by default, so by the advertising industry's reasoning it isn't a true indication of whether I want you "borrowing" it. If you were regularly "borrowing" my car without permission, it'd be reasonable for me to walk up to you, slap you in the face, and call you an SOB, but I'd be more likely to just call the cops.
Data about me is owned by me, and other entities can process it only in strongly limited circumstances. That's the default position, by the standard of basic decency, and also in law. Stop making it sound like me demanding control over my own possessions is unreasonable.
Yes, that is what you want. But you know there are other people in the world? Who want different things? Some of which may be the opposite of what you want? And that therefore, in the interest of civilization, we find consensus between what everyone wants?
Not pointless. We know that most people are not okay with tracking (the opt-out rate on iPhones is 90+%), so the right setting is on by default.
However while the ad industry might be okay with a few nerds opting out they weren't okay with most of the general public opting out and so they spread stories like the one you repeated.
To get to 90%+, Apple had to present their users with a forced choice. The majority of users might prefer not to be tracked if they're put on the spot and required to give an answer, but how many would actually go to the trouble of changing a default?
> Unfortunately, DNT is not a clear declaration of intent
Often it is. Firefox, Brave and Safari explicitly advertise themselves as privacy-friendly browsers.
That leaves non-savvy users who just use whatever defaults exists, but there is an even stronger argument to protect precisely those people - you can't consent to something you don't yet understand.
Tracking can leak extremely sensitive information, just like microphone, screen sharing and webcam permissions could. Protecting the user is a sane default in all of these cases. The fact that personal data has commercial value is secondary, just like it would be with webcam access.
Imo, the only meaningful difference between tracking and camera access is that fully-fledged tracking was an accidental side effect of third-party cookies, and before "we" understood the implications of that, a trillion-dollar industry was established. The reason we're apathetic to tracking is that it's abstract and novel, whereas snooping on your audio or video is easier to grok.
Servers have access to both the DNT header and the browser ID. Advertisers could have argued that the DNT header sent by some browsers was not an informed decision, and likely forced those browsers back to explicit opt-in/opt-out by users. But they did not. They used the first escape hatch they found to ignore the header completely, kicking the can further down the road. Of course they did, because everybody knows how many people will opt in to tracking without being bribed.
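For what it's worth, honouring the header server-side would have been trivial. Here's a rough TypeScript/Node sketch of the idea; the cookie name and the whole "else" branch are made up for illustration, and of course the absence of DNT would still not count as consent under the GDPR:

    import { createServer } from "http";

    const server = createServer((req, res) => {
      // The "DNT" request header: "1" means the user asked not to be tracked.
      if (req.headers["dnt"] === "1") {
        // Honour the signal: serve the page without setting any tracking cookie.
        res.writeHead(200, { "Content-Type": "text/plain" });
        res.end("Content served, no tracking.");
      } else {
        // Only here could tracking even be considered (consent would still be
        // required; this is purely illustrative).
        res.writeHead(200, {
          "Content-Type": "text/plain",
          // Hypothetical cookie, for illustration only.
          "Set-Cookie": "visitor_id=abc123; Max-Age=31536000",
        });
        res.end("Content served.");
      }
    });

    server.listen(8080);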
Defaulting to no tracking is the correct default for advertisers that respect the GDPR. If someone wants to be tracked, they can opt in by turning DNT off.
They obviously wouldn't want to just comply with DNT (or any other easy way to opt out), as they'd be signing their own death warrant.
Instead they exploited the apathy & incompetence of the regulators with their so-called "consent" flow. Considering the GDPR was supposed to be enforced since 2018 and they've made it to 2021 without any consequences I'd say that strategy paid off.
I have also seen some sites using a pattern where you either accept their tracking or you can't use the site at all; they just block the content or send you to a useless page. That isn't legal either: consent has to be something people actively give, and not giving it can't be a reason to refuse service. Quite a lot of gyms are getting this wrong with regard to fingerprints; they don't get to force that mechanism on you and deny you access if you won't provide it.
When the legislation first came in I reported about 100 websites that were breaking the law in obvious ways, they are still like that and the ICO hasn't even responded to those complaints.
This has been my experience as well - the complaints take lots of time to write and manage (you have to first complain to the company and give them 30 days to respond, etc) and in the end the ICO was completely useless anyway.
The rules are actually very simple: you have to obtain clear, explicit consent, given of the user's own free will, in order to track them, or you're breaking the law. Don't like it? Tough luck.
If I put up a blocking banner that said "we track users on this site to pay for it with ads, click OK or leave", there are lots of gotchas under the GDPR that make this potentially insufficient. I think folks claiming the GDPR is "simple" do not understand it, or how complicated it is.
We have this for age restricted sites already - if you are under 18 leave, if over 18 continue, content is NSFW.
Reddit does the same thing for adult subreddits: you get a popup saying this content may be NSFW; if you are at work and will be fired, leave; if not, click through if you want. You've been warned.
This model however may NOT be permitted under GDPR which has a TON more requirements on tracking consent, recordkeeping for consent, versioning of disclosures, tracking versioned disclosures to consent identifiers tied to other identifiers etc. If you violate any of this you are breaking the law, are committing crimes, and may need to pay $20M or a % of turnover whichever is GREATER!
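To make the record-keeping part concrete, here is a rough, non-authoritative sketch of what such a consent record could look like. Every field name is made up for illustration; nothing here is prescribed verbatim by the regulation:

    // Illustrative only: one way to model a consent record with versioned
    // disclosures, as described above. All field names are hypothetical.
    interface ConsentRecord {
      consentId: string;         // arbitrary identifier for this consent event
      subjectRef: string;        // pseudonymous link to the user (e.g. account id)
      disclosureVersion: string; // which version of the privacy notice was shown
      purposes: string[];        // e.g. ["analytics", "ads"]
      givenAt: string;           // ISO-8601 timestamp of the consent
      withdrawnAt?: string;      // set if/when consent is later withdrawn
    }

    const example: ConsentRecord = {
      consentId: "c-2021-12-0001",
      subjectRef: "user-42",
      disclosureVersion: "privacy-notice-v3",
      purposes: ["analytics"],
      givenAt: new Date().toISOString(),
    };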
There are multiple ways to satisfy the requirements, but that's hardly Kafkaesque. It's simply convenient for the ad business to pretend the rules are incomprehensible, because they'd really rather not understand them.
I'm sure there are real problems with the GDPR (e.g. perhaps how and particularly where it's enforced, and how it favors large business over small, and that there aren't enough practical exemptions for small-scale data collection), but the fact that there's no reasonable and clearly legal loophole for the ads/tracker-business isn't one of them. That's not Kafkaesque, that's by design.
"They often do not allow a single click deny, you have to go through sometimes dozens of vendors and deny them one-by-one. This is so obviously illegal it isn't even funny."
The site sets two cookies on landing regardless of any clicks anywhere.
Edited because I can't reply:
There are lots of lies being told on this discussion. The EU websites track you even if you don't hit accept. It's a 13 month cookie.
"
When opening a page where Europa Analytics is enabled, the browsing experience is registered by the service.
If you refuse cookies, you will also stop the Europa Analytics service. If you choose, though, to contribute your browsing experience on our websites as part of the anonymous statistics, you will enable us to significantly improve the performance of our communication, its outreach and its cost-efficiency."
Before accepting any cookies I got a _pk_id cookie expiring in 13 months.
They are clear this is what will happen.
Just check it for yourself before you listen to the lies / blather you read here.
The EU's own websites track you on first landing.
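If you want to reproduce this yourself, open the site in a fresh private window and, before touching the banner, run something like the following in the devtools console. Note that document.cookie only lists non-HttpOnly cookies, so also check the devtools storage tab for the full list and the expiry dates:

    // Rough check, run in the browser console before interacting with the banner.
    const cookies = document.cookie
      .split(";")
      .map((c) => c.trim())
      .filter((c) => c.length > 0);

    console.log(`${cookies.length} cookie(s) set before any consent click:`);
    for (const c of cookies) {
      console.log(" -", c.split("=")[0]); // print names only
    }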
Note - I have been following this. They used to do a blocking cookie pop-up. That one actually set nothing on pop-up, but blocked you from using their websites until you gave or denied consent.
The problem was, these required cookie popups are so annoying that many folks have (perhaps illegally) moved to the EU's new model, where they stick it at the bottom, they set the cookies, and if you just use the website you get them.
"Consent must be freely given, specific, informed and unambiguous."
So a question remains, if you give someone the option to decline to be tracked, is that enough? Or do you need actual consent?
The EU website does the tracking, with an option to opt out. Other experts say you really should have consent first before doing any tracking.
Anyways, not giving my opinion on which is right, just that there are different views, and even EU does it in ways I think that folks here do not understand.
The one thing, the EU sites are extremely CLEAR about things, I do like that.
It deeply saddens me that for all of the greatness humanity is capable of we're still dealing with pop-ups and "cookies" when the solution is obvious and should have been in place years ago (and the current situation has ruined the modern web because "senator we sell ads")
All you need is to build this into devices sold in the EU - iOS, Android, Windows... Each gives you privacy controls at the OS layer that applications must respect; at the browser level, this may be a "reject tracking and cookies" setting. Boom. Done. All EU websites would be required to check for this API, and their JS code must be plain to see for any visitor using the "view source" option. Going forward, we can build privacy controls at the technical layer, so that regardless of the 'stack'/'layer', software and hardware is built with the GDPR in mind. We are still a long, long way away from that reality and, truthfully, we will likely not be there for many decades.
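As a very rough sketch of the idea: the OS-level API described above does not exist, so the closest stand-in browsers expose today is the largely ignored navigator.doNotTrack signal, which is what this example checks before loading anything from a (hypothetical) third-party script URL:

    // Sketch only: gate loading of a third-party script on a browser-exposed
    // privacy signal. navigator.doNotTrack exists today (deprecated and widely
    // ignored); the OS-level controls described above are hypothetical.
    function userDeclinesTracking(): boolean {
      return (navigator as any).doNotTrack === "1"; // "1" = do not track me
    }

    function loadTrackerIfAllowed(src: string): void {
      if (userDeclinesTracking()) {
        console.log("Privacy signal set; not loading", src);
        return; // the script tag is never even created
      }
      const s = document.createElement("script");
      s.src = src; // hypothetical third-party script URL
      s.async = true;
      document.head.appendChild(s);
    }

    loadTrackerIfAllowed("https://tracker.example/analytics.js");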
Sadly, it seems this "cookie" debacle is more societal than technical, and it's obvious cookies should probably have been replaced by now with better solutions.
Maybe the GDPR might finally yield some positive changes but I remain doubtful. The industries it wants to disrupt have powerful lobbyists (hence why most right to repair legislation doesn't dare challenge Apple, for example).
The point of the cookie warning was not to give users the option to disable cookies (although giving the option to users that are not familiar with their UA is also a nice side effect).
The point was to force websites that use cookies for "dubious" tracking purposes to show the banner as a mark of shame, so that users would naturally migrate to websites that don't spy on their users and therefore don't show such banners.
Obviously, this universe being the dystopia that it is, every website started showing these banners overnight and users started ignoring them anyway.
If you just force all browsers to ignore cookies, period, then you have another Do Not Track scenario, where everyone just sets the flag and tracking therefore continues, just using other methods.
I guess the misconception about GDPR and cookies is still around. Presumably it's due to the earlier ePrivacy Directive (aka "cookie law") which I agree is completely stupid, but GDPR covers more than just cookies.
The GDPR mandates that data subjects provide informed consent before you are able to collect and/or process their personal data for non-essential purposes (ads & analytics don't count as essential).
The technical means you use doesn't matter. It can be cookies, but it can also be browser fingerprinting or IP addresses (which you can't deny as the remote server needs to know your IP to communicate with you), or it can even be information you manually enter (such as name & address for payment processing).
A purely technical solution will only cover the black & white case of "provide the data or not", it will not cover more nuanced cases where you need to provide the data for essential purposes (the IP so you can load the website, personal details for payment processing) but do not wish this same data to be used for other, non-essential purposes. A legal solution here is needed and that's what the GDPR is about.
Cookies are a tiny part of what the GDPR is about (and it isn't even the main regulation that covers them). It's what people associate with it because adtech companies have made cookie banners deliberately obnoxious for that very reason. The vast majority of things covered by the GDPR are about the day-to-day protection of everybody's personal data, whether that's medical records, CCTV, employment records, insurance etc. Most of the provisions in it were already the law in most of Europe: it just standardised the rules and enforcement procedures. The main problem with the GDPR is the lack of enforcement, which is largely down to the lack of resources in the national privacy regulators in member states.
Not sure it applies to this one, but what I regularly notice is dark patterns and default options that are opt-out (it has to be fully opt-in, even with defaults). Of course that is in breach of the GDPR.
Aliens (and any other sentient species intelligent enough) would cringe at how our species is expending insane resources to essentially waste our peers' time (by showing them ads) and trick them into buying things they don't actually need all while destroying our planet.
Whatever the GDPR was supposed to accomplish, wouldn’t it have been better to simply criminalize whatever the law imagines people should have to give “consent” to?
One thing GDPR v2 should mandate: these popups MUST have a prominent button to only enable the "necessary" cookies. Better yet, create a site which would state my preference globally and which they'd have to obey. Without this, companies outdo themselves with deceptive design to make you enable more cookies than you'd like, and it's fucking annoying.
Most users don't care about a session cookie or whatever. In fact, for some sites/SPAs where users are behind NATs or CGNATs they are pretty needed/useful. Or if the website is hosted behind a load balancer, you can do sticky sessions with the cookie, and many more uses.
Session cookies (and other cookie types that are actually necessary for proper site functioning) do not require, and never have required, consent. It's all of the other crap that does.
Again, this illustrates how badly understood and how hard the GDPR is.
Browser preferences are currently opt-out, i.e. the user has to go set some flag. The GDPR requires informed, affirmative consent.
"Consent must be freely given, specific, informed and unambiguous."
So having a website that track you unless you set some browser flag is not enough. Many folks say that you need specific consent to track. That is why there are so many pop-ups on EU websites.
Anonymized Matomo tracking strips part of the IP address - it is not as precise, but it does not track you across those sites. So they have some kind of page counter, but it is not the same as full analytics.
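For anyone wondering what "strips part of the IP address" means in practice, the usual approach is to zero out the trailing octets before anything is stored. A minimal sketch (IPv4 only; the real tools also handle IPv6 and let you configure how much to mask):

    // Minimal sketch of IPv4 anonymization by zeroing trailing octets, similar
    // in spirit to what analytics tools describe as "IP masking".
    function anonymizeIPv4(ip: string, octetsToKeep: number = 2): string {
      const parts = ip.split(".");
      if (parts.length !== 4) throw new Error("not an IPv4 address");
      return parts
        .map((octet, i) => (i < octetsToKeep ? octet : "0"))
        .join(".");
    }

    console.log(anonymizeIPv4("203.0.113.57"));    // "203.0.0.0"
    console.log(anonymizeIPv4("203.0.113.57", 3)); // "203.0.113.0"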
Reality is EU website sets a 13 month cookie, they clearly explain they will track a lot of data about you.
Anyways, some folks here claim that just using things for analytics is NOT an allowed exception to the GDPR notice rules. I mention this to again show that the GDPR is not simple (despite claims here that it is "so easy").
Good luck making money as a shitty restaurant that doesn't care about food safety and hygiene. But who cares about restaurant owners. Certainly not the EU.
Snark aside, if you can't make money without stalking users, maybe you shouldn't be in business.
Is GDPR coming to restaurants somehow? Oh god! This really is getting worse and worse. Is this for things like remembering your favorite orders etc? I could see some GDPR arguments there. What if a waitress just remembers your orders in their head - will they need a consent form? If they put it in their CRM / sales system?
The issue folks have is that many users, if given a choice, will take the free service (Instagram / TikTok / free Gmail) in return for being tracked.
Feel free to start up the business that doesn't do this (protonmail etc). But MAJOR services are built the other direction (ie, billions of users worldwide).
Yes, this would be covered by the GDPR and for good reason. If I join tonight's waiting list and provide a phone number in case they suddenly have a table become available I do not want that phone number to be reused for marketing spam down the line.
> The issue folks have is many users if given a choice, will take the free service [...] in return for being tracked if they are given a choice.
So clearly, when given the choice, most people would rather not be tracked. The problem the GDPR is trying to address is that people are not given the choice.
> But MAJOR services are built the other direction
There were plenty of businesses in the past that were built on basics that we now deem harmful. Back in the early 20th century, it was legal to sell radioactive water and market it as a miracle cure: https://en.wikipedia.org/wiki/Radithor
Society has since determined this is harmful and outlawed that. The same thing is currently happening with environmental pollution, and the GDPR is trying to do the same for noxious business models on the web.
Are there privacy organizations that can stop themselves from putting out unhinged, frothing-at-the-mouth press releases like these? I'd like to be aligned with some privacy interests, but I have absolutely no intention of associating with anyone who would write this.
I'm slightly surprised by your comment. The release seems pretty factual to me? I too am sensitive to overwrought tech organisation prose (privacy/piracy/open source groups/basically anything to do with Assange) but this seemed ok. What about it strikes you as particularly unhinged?
The main word I can find that seems like it might be regarded as over-emotive is "plagued". Is it that kind of thing? [ OTOH, bad GDPR popups are pretty much a scourge... ]
edit: oh I guess the stuff about advertising firms depriving people of their "fundamental rights" is yeah a bit over-wrought...(though privacy is important, at least to me, and I think it's ok for a civil liberties organisation to care a lot about it).
Check your cookies. Tracking cookies have been set before consent.
Many on HN are ranting about how something like this is illegal or that the GDPR is easy. The answer is that the GDPR is NOT easy. Some would say that setting tracking cookies on landing without consent is not legal, others that it is, and the EU is all over the place on enforcement.
If you browse an EU website, they track you without any explicit consent.
I prefer this approach personally. All their websites used to have a modal pop-up, and you could not move forward until you consented or declined. Between phone, desktop, etc., it was SO annoying. I'm sure it turned a lot of folks off the GDPR, so they cheat with this to try and avoid annoying people (as many others do).
>Nothing has made it seem like anyone actually looks at any of this themselves. I've been following this a while now.
This is you, mate. Look through your comments here; almost all are aggressive, confrontational, and toxic. Some of your comments make assertions with no supporting information and are subsequently dispelled. I can't tell if you're just trolling or are otherwise oblivious to your behavior. If you demand better discourse, practice it.
"Incognito" mode (or whatever one's browser may call it) is irrelevant in this scenario. This mode of browsing simply eliminates a subset of browsing data upon exit. Such a mode is useful for quickly navigating to some website you don't want showing up in your history or maintaining a session and that's about it.
Sorry for not being clearer - I will try to spell things out better.
"Incognito" mode (or whatever one's browser may call it) is irrelevant in this scenario."
No it is not. When looking at what a website does, if you browse in your regular session you may be bringing over a very large cookie store, visit history, prior acceptance of cookie popups. The reason I suggest using a new incognito session is because you start these sessions fresh from a cookie perspective, and the site you are visiting should see you as a new user and re-prompt for cookie acceptance etc. Then you can easily see what they are doing.
It's actually really interesting being lectured and yelled at by "experts" here. I think the GDPR is sort of a bandwagon thing at this point. A fair number of relatively uninformed folks on the issues / technical side jumping on and making a set of fairly strong and often uninformed claims?
You literally have folks saying the GDPR is not complicated, and literally on the same threads making plenty of contradictory claims (i.e., totally illegal to not have a deny-all button, totally banned to track before acceptance, OK to track before acceptance (default opt-in vs opt-out), etc).
The folks claiming the GDPR is easy don't realize how big a consulting industry has sprung up to help folks try to get it right. I mean, tracking rules for the cookie-acceptance cookies are right up these consultants' alley, because they make so much money off all this.
TLDR: GDPR covers more than just cookies and restricting cookies does next to nothing when it comes to tracking (in fact modern browsers already do restrict cookies by default, which is a pain to deal with as it does break legitimate usage such as cross-domain SSO).
The issue though here is around consent pop-ups being found to violate the GDPR. I'm just pointing out that the pop-ups continue to be very complicated for folks to deal with. My understanding is that the cookie set by the consent pop-ups may now be classified as personal data, requiring a data controller, permissions etc. Or am I misunderstanding this latest twist? Shouldn't be hard to fix, but good lord if you are a smaller player.
Just note, politically somehow cookies have become the big privacy boogeyman. Meanwhile personal data is harvested at HUGE scale (cable TV / smart TV/ ISPs etc) at least in the US without consequence. Ripoffs online are insane in quantity without much consequence. Major issues like DoS attacks go unaddressed and more.
And practically, while the EU says ads and analytics are not essential, for free websites they are in reality pretty essential, so there is a lot of natural tension there.
I am bummed about SSO issues now and modal cookie popups particularly on mobile etc.
The other issue privacy folks are missing, I think, is this focus on endless consent and re-consent on every website. I've seen data that 95%+ of folks click "accept all". So you are delivering a product feature that annoys 95% of your target; it's a losing game.
My own expectation is that in the long run this compliance overhead will help the mega platforms win out. They are the only ones (YouTube / Google / Microsoft / Apple, etc.) with the scale to really manage the years-long investigations, million-page doc requests, etc. I.e., if you are going to put a video online, you are going to need to put it on YouTube if you want anything analytics-related from it.
The other side is that the vast majority of folks ignore the law and just hope they won't be caught, or do a random "I accept" cookie popup and hope they are covered.
> The issue though here is around consent pop-ups being found to violate the GDPR.
Pop-ups don't violate the GDPR.
Making a system, pop-up or otherwise, that is significantly harder to opt out of than to opt into, and then pretending that users deliberately chose tracking: that, however, is a violation.
The rules are simple: tracking is opt in. You have to convince users there's something in it for them.
And, as someone who gave especially Google the benefit of the doubt for the longest time, I can testify that for me at least it did not result in relevant ads at all.
Relevant ads would be:
- local shops (I got two ads for local shops in a decade)
- family cars (I bought three used ones during that decade, not a single Google ad that I can remember)
- programming conferences
- power tools
- programming tools (one for Jetbrains tools five years ago, also a few for WordPress hosting after I searched for it.)
Consent pop-ups as a whole haven't been found to violate the GDPR. Specific patterns that certain consent pop-ups implement were determined to be in violation.
> My understanding is that the cookie set by the consent pop-ups may now be classified as personal data
I have re-read the press release and I don't see that anywhere. A cookie saying consent=true is absolutely not personal data. The problem is that personal data was being collected without proper consent (the pop-up that was supposed to obtain the consent does not comply with the regulation).
> politically somehow cookies have become the big privacy boogeyman
This is partly a holdover from the previous ePrivacy Directive which very much focused on cookies. Consent management solutions (both compliant and non-compliant) were designed to address that. These consent management solutions are now being repurposed to comply with the GDPR and some may not have adjusted their wording and still incorrectly focus on cookies even though the GDPR covers data collection & processing regardless of the technical means of doing so (so it's no longer specific to cookies - you can perfectly breach the GDPR without setting a single cookie).
> Meanwhile personal data is harvested at HUGE scale (cable TV / smart TV/ ISPs etc)
This is something that the GDPR addresses. Unfortunately enforcement has been severely lacking.
> the EU says ads and analytics are not essential
Ads aren't technically forbidden. Non-consensual data collection for targeting is. You are still allowed to serve ads as long as they aren't targeted based on the user's personal data (you can still target based on the currently viewed page's content for example).
Analytics are frankly not essential - you don't lose anything if only 10% of people opt-in for example. But even then, it is absolutely possible to implement analytics in a privacy-respecting way without relying on any personal data (and thus not require consent); for example a single hit counter that just increments an integer every time a button is clicked should not require consent. You'll only need consent if you tie that analytics event to a persistent user session.
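As a rough illustration of what such a counter could look like (the endpoint names here are made up; the point is that nothing per-user is ever stored):

    import { createServer } from "http";

    // Aggregate-only counter: no cookies, no user identifiers, no IP addresses
    // stored; just one integer per event name, kept in memory for simplicity.
    const counts = new Map<string, number>();

    createServer((req, res) => {
      const url = new URL(req.url ?? "/", "http://localhost");
      if (url.pathname === "/hit") {
        // Hypothetical endpoint: GET /hit?event=signup-button
        const event = url.searchParams.get("event") ?? "unknown";
        counts.set(event, (counts.get(event) ?? 0) + 1);
        res.writeHead(204).end();
      } else if (url.pathname === "/stats") {
        res.writeHead(200, { "Content-Type": "application/json" });
        res.end(JSON.stringify(Object.fromEntries(counts)));
      } else {
        res.writeHead(404).end();
      }
    }).listen(8080);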
> since for free websites they are in reality pretty essential
This is due to a lack of proper enforcement of the GDPR. At the moment the big players violate the GDPR, so any site that decides to respect it will lose out. If the GDPR were enforced and everyone was complying, the playing field would level out: either ad prices for non-targeted ads would go up (as very few people opt into ads, leaving a lot of inventory on the table that advertisers will suddenly want to capture) or services would start to ask for payments, normalizing non-ad-supported services.
> I am bummed about SSO issues now
That's a problem because the lack of (enforced) legislation around cookies made browsers delete them aggressively. It would've been better if legislation such as the GDPR was enacted (and enforced) sooner so that abuse of cookies could've been dealt with legally, leaving the concept of cookies itself as-is so it can still be used legitimately.
> modal cookie popups particularly on mobile
That's again due to the lack of enforcement. A proper consent flow should allow you to easily decline all cookies, or even better, just not be there in the first place because the website doesn't have to collect & process any personal data for non-essential purposes (just FYI, cookies essential to the functionality of the website don't require consent - a session cookie on login or for a shopping cart does not need consent).
> The other issue privacy folks are missing I think is focusing on this endless consent and reconsent on every website.
Again see above. Most of the annoying consent flows aren't actually compliant. The reason they're there and are annoying is to trick you into clicking accept (and to hate on the GDPR). Hopefully this ruling will force them to comply with the law which says that it should be as easy to decline as it is to accept.
> I've seen data that 95%+ of folks click accept all
See previous paragraph. That's the intention behind these non-compliant consent flows. If it's too difficult to decline then people will click accept all. On the other hand, when the flow is implemented properly such as Apple's App Tracking Transparency flow (which gives you a system-generated modal that allows you to accept or decline in one-click), the opt-in rate is in the single-digit percentages.
> in long run this compliance overhead will help the mega platforms win out
I am not sure. They are (well, were) currently winning because they are big enough to risk it, but the tide is turning with rulings such as this one. Facebook is in big trouble for example because their business model (which relies on mandatory, large-scale data collection) is at odds with the GDPR and they are trying to legalese their way out of it, unsuccessfully: https://noyb.eu/en/austrian-supreme-court-facebook-dismissed
> if you are going to put a video online, you are going to need to put it on youtube if you want anything analytic related to it
If you put a video online you can easily count the number of views by just analyzing server logs. What you can't do (and Google can't either) is track users to determine unique views for example (as that would require assigning each user a persistent ID).
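As a crude sketch of the log-counting approach (the log path and format are assumptions about your own setup, and this counts requests, not unique viewers):

    import { readFileSync } from "fs";

    // Count plays of one video by scanning an access log. Assumes a
    // common/combined log format where the request line is quoted; adjust
    // the path and video URL to your own setup.
    function countVideoViews(logFile: string, videoPath: string): number {
      const lines = readFileSync(logFile, "utf8").split("\n");
      let views = 0;
      for (const line of lines) {
        // e.g. ... "GET /videos/talk.mp4 HTTP/1.1" 200 ...
        const match = line.match(/"GET ([^ ]+) HTTP/);
        if (match && match[1] === videoPath) views++;
      }
      return views;
    }

    console.log(countVideoViews("/var/log/nginx/access.log", "/videos/talk.mp4"));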