Here is a direct quote from the conclusions of the findings:
Our concern is that the practical aspects work for individuals, so we do not want to set expectations which cannot be met. Going beyond the idea of a search engine, if information is so proliferated on the internet, how would it be practical to remove all that information? As a regulator, we only want to enforce things in a way where we can achieve the end results…
We rather doubt whether the Government believe there are "ways to make this practical and workable".
Right, I don't understand why so many legislatures go to the trouble of having a bicameral system and then make both houses elected, which means both are subject to the same pressures.
In the US, the Senate used to be elected by state legislatures, and therefore represented the interests of the state governments in checking the federal government's power. The Constitutional amendment that began the direct election of Senators has had really catastrophic consequences.
The Senate before 1913 was a hotbed of cronyism and corruption, with seats going vacant for years at a time because there was no requirement for the states to actually fill them. The Seventeenth Amendment was not catastrophic by any means.
Really? Life in the United States has gotten catastrophically worse since 1913 because citizens vote for senators, as opposed to the sterling types of people who are politicians at the state level?
Often one house still has additional isolation. For example, in the US the Senate was designed to be isolated through the use of longer terms, higher age limits, and some other procedural differences.
While it has been eroded by increased life-expectancy and modern media, I'd still consider heredity-based membership an unacceptably-extreme solution. (Also unconstitutional, but you'd need an amendment to rework the Senate anyway.)
The House of Lords isn't generally hereditary. A small portion of its members are hereditary peers, but (and I forget the details) this will ultimately end. The majority of members are life peers, appointed for contributions to public life/politics/etc.
Right, the US's real House of Lords is the Supreme Court, which likewise is a panel of life appointees with the power to strike down laws written by the elected legislature.
(Indeed, until 2009 the House of Lords was also the UK's highest court of appeal.)
That's the opposite of common sense. There is a gulf between making information instantly and trivially accessible -- Google's search business -- and it existing at all. For a long time, the relative anonymity afforded by the difficulty and cost of finding information about people worked reasonably well. Google has now upset that apple cart (and we know their attitudes well, but in case you don't: their sociopathic ex-CEO wants privacy for himself but not for you). Effective anonymity (for most, but not all) doesn't require information to be vanished; it just requires it to be expensive or difficult to discover.
edit: and ignoring the recent changes (ie pretending that it's somehow inherently right that everything linked to your name from even 20+ years ago should be instantly and freely discoverable) makes you, well, Google's tool.
Search engines were doing that 20 years ago, before Google even existed. The only thing that has changed is the amount of information that is posted on the web to be indexed — and the amount of time per week the average person spends using the web.
No, there has simultaneously been a huge increase in the quality of search engines (driven by Google, to their credit), plus an increase in how much of the available material they can search.
Sure, technological advances have allowed search engines to keep pace with the expanding web. But it seems an odd leap to get from there to suddenly blaming them for discoverability* of content.
Besides, I don't think advances like PageRank (the founding technology of Google) are relevant to making old content about a named individual more discoverable.* This kind of quality improvement helps with ranking when you have 100,000 hits. But when you have a unique firstname-lastname combination with 3 hits, all that matters is the number of pages you were able to index.
*Edit: replaced "available" with "discoverable" throughout, per x0x0's critique below, since that was what I meant.
I'm not blaming them for availability, but discoverability, because availability w/o discoverability is, for most people, the same as information not existing. I'm blaming them for massively increasing discoverability, pretending that somehow this is the natural and correct state of affairs (and further, that this is so obvious it doesn't require debate), then pitching a tantrum when a government uses its ability to control commercial behavior in their country to interfere with what Google wants to be the "natural" state of affairs.
Is literacy a natural state? If you consider the history of the US, probably just barely.
On to your argument, such as it is: pretending that our only two choices are easy access to all information or none at all is, well, dumb: we can certainly allow individuals (as Europe has demonstrated) to remove things from Google while retaining access to, oh, 99.9998% (first estimate of Google's index that popped up on Google: 30B pages; removal requests: 70k in 2 months; 1 - 7e4/30e9) of the information in the world. And that is a severe underestimate: the information can still be returned, just not for particular searches, as far as I understand. So starting with humane things like removing revenge porn, whether or not the host is within US copyright jurisdiction, and probably moving on to relatively private facts about people's lives, is a great place to start. I'm not sure where we'll end, but I'm far more comfortable with governments -- who, after all, nominally represent the governed -- deciding than with Google.
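The back-of-envelope estimate above can be sanity-checked in a couple of lines of Python (note that both figures, the ~30B-page index size and the ~70k removal requests, are the commenter's own rough numbers, not verified statistics):

```python
# Rough sanity check of the estimate: what fraction of a
# ~30-billion-page index is affected by ~70,000 removal requests?
index_size = 30e9       # ballpark size of Google's index
removal_requests = 7e4  # requests in the first ~2 months

fraction_removed = removal_requests / index_size
fraction_retained = 1 - fraction_removed

print(f"{fraction_retained:.4%} of the index untouched")
```

This confirms the 99.9998% figure quoted in the comment, to four decimal places.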
edit: for another example: the HN audience in particular, which runs young, white or Asian, male, glibertarian, and tech-employed, has little to fear from personal info leaking online: SF doesn't care about most things, ranging from LGBTQ to BDSM to swinging to whatever. But there are a lot of places that do. You don't have to look far to see what would happen in, say, Utah to a high-school-aged transgender kid. Restricting that information online is completely reasonable.
The trouble is the data starts to take on a life of its own. Sure, it was difficult to acquire, but now it's there, in that huge collection of data, and getting rid of it is more complicated than
DROP TABLE x0x0_2003;
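To make the point concrete, here's a minimal sketch using Python's built-in SQLite (the schema and names are made up for illustration): wholesale forgetting is one statement, but selectively erasing one person means hunting their name down in other people's rows, and even then a pattern match over free text is only a crude approximation.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical schema: posts reference users by name in free text.
cur.execute("CREATE TABLE posts_2003 (author TEXT, body TEXT)")
cur.executemany(
    "INSERT INTO posts_2003 VALUES (?, ?)",
    [
        ("x0x0", "my own post"),
        ("alice", "replying to x0x0 about something embarrassing"),
        ("bob", "unrelated post"),
    ],
)

# Wholesale forgetting is trivial:
#   cur.execute("DROP TABLE posts_2003")
# Selective forgetting is not: mentions hide in other users' rows,
# and a LIKE scan over free text is only a crude first pass.
cur.execute(
    "DELETE FROM posts_2003 WHERE author = ? OR body LIKE ?",
    ("x0x0", "%x0x0%"),
)
remaining = cur.execute("SELECT author FROM posts_2003").fetchall()
print(remaining)  # only bob's row survives the crude purge
```

And this still misses paraphrases, nicknames, and copies of the data that left the database long ago.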
Honestly, is there any precedent where reasonable information purges have been required in the past, targeted or otherwise? I would be similarly hard-pressed to selectively erase information from my data, and all I have is a couple hard drives, a filing box, and a number of sales receipts buried in every corner of my life.
Common sense would propose something workable. Instead, this is proposing straw men in order to try to win the PR battle.
The "right to be forgotten" goes hand-in-hand with "paid his debt to society". It also goes hand-in-hand with "innocent until proven guilty".
The powers that caused this privacy apocalypse in order to profit don't get any support from me until they propose a solution that allows people who are accused of a crime but later exonerated to expunge that information without having to pay for armies of lawyers.
> The powers that caused this privacy apocalypse in order to profit
Wow, such loaded terminology. Those "powers" are literally nothing more than the logical and direct evolution of digital information technology. You can point fingers at some company or technologist or other, but those are incidental actors amongst a much larger sea change in how information is stored, copied, and transmitted.
Just a very few years ago, the potential ramifications of these changes weren't really apparent to anyone. We're only just now coming to grips with it and like all periods of social change, it's not really clear how to proceed. There are significant and valid questions to be grappled with: if someone has a "right to be forgotten", what other rights and concerns must be balanced against that? Who holds responsibility for this "forgetting"? Google argues that it's unjust for it to shoulder this burden. They may be right -- this problem has so far been tackled by one branch of government in one world jurisdiction. Historically speaking, I think it unlikely this one case will be a full and fair treatment of the issue.
> without having to pay for armies of lawyers.
It may be that we've built infrastructure that makes other alternatives, all factors considered (including, but not just the rights of those who've "paid their debts"), not merely inconvenient but impossible in practice. This doesn't even mean that there may not be solutions, but that this approach to modern issues of privacy is a dead end.
If only that common sense arrived with the "porn" filter. Perhaps this insight comes from the whole porn filter thing, because they learned that, too, is unworkable.
I'd like to bring up the idea of a Jubilee, an ancient idea that might be a guide for how to approach this problem again.
Basically, in the old days, people were able to fuck up their circumstances permanently because they lived in a small village where everyone ended up knowing everything.
Because that's obviously not conducive to a well-functioning society, everyone agreed to forgive and forget (debts at that time) every so many years.
This might be a reasonable way to handle our current data congestion problems. What if companies agreed to flush their user data every ? years?
edit: this isn't an explicit plan, just something to chew on. Also it doesn't address Google's issue, so it's kind of a distraction to the thread i admit.
I'm not sure you understand the issue here, because on the face of it you suggest that we burn our historical record (newspapers) and obliterate websites every 7 years.
The issue here isn't that Google has some information on you that leaks and if Google doesn't have this information then it won't leak.
The issue is that there is a web page, Google indexes that web page (because that's the purpose of a web search engine), someone doesn't like what that page says about him and now they have a way to ask Google to remove that page from Google's index.
It's basically decentralized censorship with frighteningly low standards for removal of information, nonexistent oversight and apparently no penalty for trying to abuse the system.
There is no solution to the "problem" that doesn't involve heavy-handed censorship.
I know what you're getting at, but I have a hard time considering it 'heavy-handed public surveillance' when the info could already be found via Google in the first place. It's also not like the NSA (or anybody else) is just going to magically delete all the information they have stored when it can't be found via Google anymore. It's censorship at worst and just not that effective for its purpose at best.
That's actually a pretty clever workaround for some of the issues, but there are others that it does not address.
My main argument against the ""right" to be forgotten" has always been web forums. Think of a European version of HN. Think of a really prolific and insightful contributor to that site. They invoke this law, and EUHN is forced to kill all the posts they ever made.
Context of entire discussion threads will be obliterated.
It doesn't have to be a European HN, just HN (as accessed from the EU) would maybe be enough. It depends on whether HN is said to be "operating in Europe" or not. I don't really get how it should be enforced, but being on an American server doesn't protect you from EU (or Chinese or Brazilian etc.) law.
That's a busted myth. Just point the DNS entries for HN to the trash can, and redirect all IP traffic bound for the correct IP address to the same trash can. You'd need a VPN.
7 years is too short a period in modern times, as even digital projects tend to benefit a lot from the old data and it's even more true for ambitious projects like rail or space.
Even an average mortgage in western societies is 20+ years so you have to remember for that long at least.
And 'forgetting' after 20 or 30 years isn't going to help, like it didn't in the case of the original Spaniard who caused the ruling.
Interesting idea. How would you define "user data"? For example would it include a Wikipedia article you wrote, a news article mentioning you, an email you received...
If you define user data in a more limited sense, I don't think it would satisfy the proponents of the right to be forgotten.
I think you misunderstand the intent. It's not about making people forget, but rather protecting second chances. At some point, a red mark on your internet-accessible past may become akin to ethnicity/gender/etc. It doesn't necessarily reflect the person you are, but it's still statistically prudent for someone else to discriminate based on that information.
This is part of why we have such strict regulations on credit reporting.
I don't think things like this are the right solution to the problem; at best they're a workaround. I don't like anti-discrimination laws either BTW, but that is a different matter.
I misunderstand nothing: You don't have a right to second chances either. The world either decides to forgive you or it doesn't. It's not up to you, it's up to them. To impose such a right would be to try to force forgiveness in the minds of others, which, again, pretty clearly illustrates the problem.
You can argue that if you want. I'm personally of the opinion that people tend not to change much.
However, you're giving a lot of power to people who probably shouldn't have it. Good luck trying to get anyone to enforce libel committed via blog/forum post.
It should probably also be noted that getting a fresh start today is much harder than ever before in modern history. It will continue to get drastically worse as we perfect facial recognition. People may not have a right to second chances, but it's been the status quo for a very long time.
> To impose such a right would be to try to force forgiveness in the minds of others
Again, no. You're misunderstanding. No one is being asked (let alone forced) to forgive/forget. The poster of the original information has no inherent right to share it with everyone in the world, and the consumer of the search engine information has no inherent right to know everything that's ever been written/observed about a person.
Obviously pushing the burden onto the search engines is pretty silly, but it's similarly obvious that the problem will get worse if it's ignored.
Any future solution to this should still not be Google's problem. If people are going to be able to have history removed from the net, they should have to have it removed from the net and not just from an index. To cover a lot of this, make some restrictions on the distribution of criminal, tax, employment, and birth/death records. We already have this in the US for medical records. Some of that information is public record, but why does it have to be free and/or indexed by search engines?
So how do you feel about internet privacy as a whole? I see a post like this and my reaction is, "I don't want everything I do to be public, intended or not" and in that case, I do want privacy. I know if I post a comment on a site, it doesn't go away. But at the same time, I don't want every single page, time, location, and overall history of my life accessible to large corporations. How do you remedy that with the current technical entrenchment in the economy and how it relates to eroding the choice of privacy from individuals? Pure anonymity and pure open-data for all will never work, so where is the balance?
You don't want to, but if someone strong enough gets interested in you, they can. You want privacy. I want privacy. But we don't and won't have any. Only the very few and strong can afford it. "Protect privacy" has come to actually mean: protect the strong. We should abolish privacy for everyone, not just de facto for the rest of us.
Not only that, but why is it reasonable to remove the choice of privacy? That's the part that concerns me the most about these discussions, is the way "the internet" should either be all one way, or the complete opposite. That will never be realistic and I feel we should be focusing on the balancing act that works the best for the majority of use-cases.
How would you feel if there was a law that forced you to "forget" anything on request? I mean, did you get explicit consent to see and hear everything you saw and heard? Maybe this wouldn't be a big deal if you get 1-2 requests a year (assuming that forgetting on demand is possible), but what happens when you get millions?
Collecting data is easier than ever before. A lot of small businesses and individuals will start doing it. Can we expect all of them to be able to fulfill these kinds of requests? I'm sure it costs Google millions. It's not reasonable to expect this of anyone or anything.
Knowledge is inherently good. This is not what we should fight against. When bad things happen, the solution is to reduce incentives.
To the 3+ people downvoting me, please explain how I am not adding to the conversation / why you are downvoting.
It's the law of nature, human memory is not perfect.
I cannot walk down a street and instantly run facial recognition on everyone, crosslink their FB/Twitter/etc accounts, analyze their text patterns, do NLP on the attitude of their recent posts (to assess likely interaction responses), find their social graph of friends, interests, recent purchases, and so on ... as it stands, that is something unique to computers. I mean, it's like you've never heard of the Chilling Effect [1] or think that it's not a big deal, is that the case?
You still have not posted your full browser history from the previous post, why not? How can I be sure you're not trying to subvert the government, HN, or my opinion if you're not fully transparent with your entire life? As you said, "The effect of any "right to privacy" is coercive ignorance. It only really benefits dishonesty and secrecy." ... So again, if knowledge is good and so is transparency, I don't understand why you haven't posted your full browsing history.
In case you're wondering, I did not downvote you (I don't have enough points). But I will reply anyway.
I don't need to have all these "skills" to gain enough information about someone to be able to destroy their life. All I need is to witness something, and tell someone. Is it illegal to witness your male colleague wear a dress and tell others? Is it illegal to tell a friend he's being cheated on? Is it illegal to tell a kid Santa Claus doesn't exist? Should it be?
The skills/tools you mention above don't have a qualitative effect on spreading knowledge. They just increase it quantitatively. My eyes won't catch so many things, but they still can catch enough information to have the same effect. And unless you argue that I should need explicit consent before I acknowledge and share anything I hear and see about a person, then I don't think it's reasonable to ban or limit access to tools that increase knowledge and transparency.
The chilling effect is very real. But is it really a bad thing?
I reached the same conclusion. Privacy is unsustainable, and it's unreasonable to expect it going forward.
It's only when you realize and accept this that you start seeing all the actual benefits of transparency.
The effect of any "right to privacy" is coercive ignorance. It only really benefits dishonesty and secrecy. Like any habit, it won't be easy to lose. But privacy is such a bad habit that we should seriously consider tackling it.
As always, the first step is acceptance and recognition.
Just because he argues that privacy is unsustainable/sub-optimal doesn't mean that he has to facilitate its erosion. It is similar to how someone who demands privacy doesn't need to prove his anonymity.
He is actively saying that privacy is a bad habit, yet he won't demonstrate that he believes what he says by posting his history. Heck, all he has to do is screenshot (or copy/paste) everything from https://history.google.com/history to prove he's not "dishonest". And if "knowledge is inherently good" then it would be inherently good to know that he is not actively trying to subvert the government by browsing certain sites or reading certain books. It reads like someone drunk on idealism who believes information will never be distorted, used against them, misapplied, injected with false data, and so on.
"doesn't mean that he has to facilitate its erosion"
It's not facilitating the erosion, it's adding evidence that what he says is backed up by fact, starting with himself. If he does not believe he should follow his own ideals, does he really believe them?
I cannot be sure, but I think he was arguing about it at a societal level. This is my interpretation:
Given the technological improvements, privacy is unsustainable. If everyone realizes this and accepts it, we can reap the benefits of transparency in a society that does not hold past actions against those who have already paid the due fine (or served the due time). Neither does that future society discriminate against people who are different. In that society, privacy will probably be beneficial only to the dishonest.
Now, I personally don't know if we can reach that stage of maturity, not anytime soon at least. But I think we are moving towards it faster than before, and we are doing so because of easy access to information, especially the kind of information we would otherwise keep private.
Right, I think that too and for the most part I agree. I have issue with the lack of those advocating for social-transparency to describe where the line is. Without having the serious conversation of where privacy and transparency are reasonably balanced, we're rushing into irrational decisions that have a long and lasting impact on society.
I have yet to see a single person who advocates for full transparency post their entire browser and search history, which to me feels odd. If they're advocating for full transparency and yet won't demonstrate that they themselves believe in the ideal enough to offer their own evidence and admission, I don't think they actually believe in those ideals fully, or more likely, have not articulated/discussed/discovered an appropriate middle ground between these two concepts.
As much as I'd like not to have to hide anything, society currently gives me no choice.
With the way society is currently designed, full personal transparency is pretty much social suicide.
Society was designed with some basic expectations of privacy. For example, we rely a lot on passwords and secrets to "prove" ownership and identity. Full transparency implies that all my private keys, all my passwords, my credit card number, everything that is meant to be private is available to anyone. In a society that relies on no such concept of private knowledge, we wouldn't have this problem. But that's not the case right now.
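The point that secrets double as identity can be made concrete. A minimal sketch of the usual pattern (function and variable names are hypothetical): a service stores only a salted hash, and knowing the password is what proves you are you, which collapses entirely if the secret becomes public.

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes) -> bytes:
    # Standard-library PBKDF2; real systems tune iterations much higher.
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)

# "Registration": the service stores only salt + hash, never the password.
salt = os.urandom(16)
stored = hash_password("correct horse battery staple", salt)

def verify(password: str) -> bool:
    # Constant-time comparison to avoid leaking information via timing.
    return hmac.compare_digest(hash_password(password, salt), stored)

print(verify("correct horse battery staple"))  # True: the secret proves identity
print(verify("guess"))                         # False
```

Under full transparency the stored hash buys nothing: anyone who can read the password itself can impersonate the owner.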
The example above is just one reason why people shouldn't surrender their privacy just yet, but should wait until the environment and context make it convenient. That said, we won't reach that state if we don't realize and accept that privacy is not something we should rely on in the future. There are steps to take to make transparency practical and fair, and solutions like the ones mentioned in the above article are not among them.
It's because they are made by people who don't fully understand the internet. On the other hand most of the people who do understand it are against any regulation of it and would not help in the creation of sane laws.
That's a part of it. Unfortunately, many people who understand the internet still persist in imagining that what happens on the internet is somehow less real or less consequential than what happens in "real life". In reality the internet is a part of real life, just like movies, or literature, or the telephone; it's not a separate or different place.
I think "Right to be forgotten" should be forced upon the media, not search engines. If the media didn't have the information on their website, then major search engines wouldn't index it.
>I think "Right to be forgotten" should be forced upon the media
Awesome idea. "Well, Mr. news agency guy, the president wants this whole 'tax misappropriation scandal' thing to be forgotten. Otherwise, you may run into some legal trouble. After all, it's his right."
Nice to see some common sense.