
> violated user privacy since the beginning

But why is there an expectation of privacy on the web, anyway? When browsing the web you're utilizing hundreds or thousands of other people's infrastructure, going to sites that are public or semi-public (aka "public spaces"), and otherwise interacting with strangers in shared spaces.

In the physical world, there is no common expectation of privacy in public spaces. If you go to a shopping mall, you may be observed by either the mall itself or its client stores, and this is not a "violation of your privacy". You elected to go to a public space, knowing you might or would be observed by any other occupant of that space. If you go to your friend's house, you may be observed by their security cameras, if they have them. This is also not really a violation of your privacy. You could also have a conversation with your friend about not being on their cameras, and maybe come to an agreement--presumably because they trust you. This isn't something that scales to hundreds of strangers, much less thousands or millions.

I think the core issue I see with the stance you appear to hold is this: privacy is not a fundamental feature of anything. Any interaction with another person involves the exchange of information, and the amount of information exchanged is neither fair nor necessarily finite. How much information is exchanged is not just a function of the purpose of the interaction. It's also determined by the abilities of all parties to observe and analyze the interaction. Trying to argue that privacy should be a fundamental right seems to ignore the obvious: that forcing any particular level of privacy requires making sacrifices that significantly impede the quality of interactions, especially those involving parties that do not necessarily trust each other.

If you care about privacy, interact with other privacy-minded individuals and establish a common set of norms amongst yourselves, and acknowledge that those norms will be different for every group. This is sane and defensible.

What isn't is demanding that the world adhere to your standards of privacy, while claiming that it's some sort of right that should be protected by someone other than yourself.

Understand that privacy cuts both ways: most of the world is apparently fine with having no expectation of privacy in public places--online or offline. They're okay with this because it makes antisocial behavior harder to hide, and makes pro-social behavior easier to verify. If everyone demanded there be an expectation of privacy in those public places, the world would be very different, and not in a good way. Distrust would be the norm, and cooperative endeavors would be bogged down by the endless need to make sure those you're working with are indeed who they say they are.

If you want good examples of what it looks like when privacy in public spaces is the norm on a large scale, look no further than the cryptocurrency world. Not exactly a shining example of a "pro-social culture".




> In the physical world, there is no common expectation of privacy in public spaces.

Oh, please. This is a false equivalence if I ever read one.

When you're in public nobody is building a profile on you. Nobody is tracking where you go, what you buy, how you behave, and what your interests are. Nobody is then storing this data and getting rich from it in perpetuity.

If privacy online were limited to companies knowing just your IP address and general location, and not storing this data, _that_ would be equivalent to public spaces in the real world. The fact that it's not, and that companies actively try to gather as much information as they can about you, often in scammy ways that force governments to come up with new regulations, where getting fined for violating them is just the cost of doing business... is beyond nefarious and any reasonable defense.

> privacy is not a fundamental feature of anything

Huh? Privacy has been a fundamental human right for millennia, codified as part of many legal jurisdictions.

> Any interaction with another person involves the exchange of information, and the amount of information exchanged is neither fair nor necessarily finite.

Again, the difference is in how this information is used. The keywords here are _reach_ and _consent_. We implicitly consent to sharing a limited set of information when interacting with another person or in public, but when that information and its use goes far beyond what we implicitly agree to, then this needs to be clarified, and explicit consent must be given.

> Understand that privacy cuts both ways: most of the world is apparently fine with having no expectation of privacy in public places--online or offline.

This boils down to two things: a) most of the world doesn't understand the implications of the transaction they're a part of when they use online services. The use of their data is hidden behind cryptic privacy policies (which are often violated and don't explain the full picture), and data collection happens in obscure ways without any consent.

And b) even if they're aware of this, most users simply accept the transaction because "they have nothing to hide", or the value they get from the service is greater than their privacy concerns.

Users are often never given the option of an alternative business model, and even when they are, the data collection machine never stops. Companies know that the allure of getting something for "free" is far more profitable than putting up a paywall, and so they justify extracting as much value as they can from their users' data. We're stuck in an endless cycle of having "free" services where the business transaction is far more valuable to the company than its users.

And it's in the companies' best interest to keep the users, the general public and governments in the dark as far as the extent and details of how value is actually extracted. This is the most vile business model we've ever invented.

> If you want good examples of what it looks like when privacy in public spaces is the norm on a large scale, look no further than the cryptocurrency world.

What privacy? The blockchain of most cryptocurrencies is a public ledger where all transactions are visible to everyone. Transactions are done almost exclusively via exchanges that follow strict KYC regulations. So it's pretty easy to identify the person behind any transaction.
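
For what it's worth, the "visible to everyone" part is trivially checkable. Here's a rough sketch that pulls one transaction from a public block explorer and lists the addresses and amounts involved (this assumes blockchain.info's rawtx endpoint and its JSON field names, and uses a placeholder transaction hash):

    import requests

    TX_HASH = "<any transaction hash>"  # placeholder -- substitute a real hash

    # blockchain.info serves raw transactions as JSON (assumed endpoint/format).
    tx = requests.get(f"https://blockchain.info/rawtx/{TX_HASH}", timeout=10).json()

    # Every input and output address, and every amount (in satoshis), is public.
    for inp in tx.get("inputs", []):
        prev = inp.get("prev_out", {})
        print("in: ", prev.get("addr"), prev.get("value"))
    for out in tx.get("out", []):
        print("out:", out.get("addr"), out.get("value"))

What it doesn't print, of course, is a name--that last step is where the exchanges' KYC records come in.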

Following your logic, should we ban all cash transactions, since they're even more private than cryptocurrencies?

You have a twisted sense of the world, friend. My guess would be that you work for one of these companies, and this is how you rationalize the harm you're putting out in the world.


> Oh, please ... is beyond nefarious and any reasonable defense.

I don't know. This seems overly reductive. People build profiles on you all the time, just by looking at you. What you wear, how you talk, how you hold yourself--these things all present a large amount of information to anyone willing and able to observe it.

Companies invest large amounts of money into understanding customer movements in their stores, to better optimize product placement and things like checkout flow. Even sole proprietors will observe their customers' patterns to get a better understanding of how their store is performing. If a worker in a cafe sees the same person come in multiple times and order the same drink each time, is that not also "building a profile" on said customer?

A public IP address and a general location is what... maybe 12 bytes of information? You broadcast a lot more than 12 bytes of information when you pass through a public place, especially when you consider all the possible forms of information you could be broadcasting. You're being overly reductive to try to make your point, and I don't really buy it.
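
(For the record, a back-of-the-envelope sketch of where a figure like "12 bytes" could come from--assuming an IPv4 address plus a coarse latitude/longitude pair stored as two single-precision floats; the exact encoding is just an illustration:)

    import socket
    import struct

    ip_bytes = socket.inet_aton("203.0.113.7")    # IPv4 address: 4 bytes
    loc_bytes = struct.pack("ff", 40.71, -74.01)  # coarse lat/lon: 8 bytes

    print(len(ip_bytes) + len(loc_bytes))         # 12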

> Huh? Privacy has been a fundamental human right for millennia, codified as part of many legal jurisdictions.

Being "codified as part of many legal jurisdictions" does not make something a "fundamental right". To take a slightly extreme example, treating women as property is "codified as part of many legal jurisdictions" in the world today, and yet it would be insane to use that to argue that "treating women as property is a fundamental human right".

You may believe that privacy is a fundamental human right, but it's not because it's been codified in "many legal jurisdictions". The logic just doesn't follow.

> Again, the difference is in how this information is used. The keywords here are _reach_ and _consent_. We implicitly consent to sharing a limited set of information when interacting with another person or in public, but when that information and its use goes far beyond what we implicitly agree to, then this needs to be clarified, and explicit consent must be given.

When did "reach" and "consent" come into this? Why are these "keywords" here? Merely _existing_ in the world is consenting to sharing "private" information with other people (or even things!) in the world. It's unavoidable. I'd argue further that it's not just unavoidable, but _necessary_ to the building of a functioning society--especially one governed by any form of democracy. It's not a matter of consent, it's merely a fact of life.

You don't explicitly consent to any level of information sharing when you go out into the world, but some amount of it is unavoidable. In fact, I'd even go so far as to argue that it's impossible for you to ever know how much information you share when you venture into the world. You can never know how much someone else can deduce about you from any given interaction, because that would require knowing what only another person can know. You also can never know how far that information might spread--because again, this would require knowing others' intentions _before ever interacting with them_, which is impossible.

> This boils down to two things: a) most of the world doesn't understand the implications of the transaction they're a part of when they use online services. The use of their data is hidden behind cryptic privacy policies (which are often violated and don't explain the full picture), and data collection happens in obscure ways without any consent.

I would largely agree with this, because I believe this is how "public spaces" already function. Data collection always happens without your consent. If I pass you on the street and note that you're wearing an Apple Watch, I have obtained information about you that you did not (explicitly) consent to share with me. I can draw reasonable conclusions from that information that might inform me of other aspects of your life that you also did not (explicitly) consent to share with me. This is how public spaces work!

> And b) even if they're aware of this, most users simply accept the transaction because "they have nothing to hide", or the value they get from the service is greater than their privacy concerns.

Again, also agree with this, because I think it's simply true. The "data" that you're sharing with a lot of these companies would be worthless to you yourself (because the value comes from correlations across many data points). The fact that it has value _to a given company_ when it might be worthless to you yourself is part of what gives that company's services value--even to you.

As an example (in an area I'm familiar with), if no company ever collected information about how their website was used (e.g. with a product like Datadog or Sentry), they would be faced with a much larger cost to improve their business. The testing burden would be far higher, they would have no insight into whether or not their product actually works for real users, they would only receive a fraction of the number of bug reports they would receive through automation, etc etc. These are all things that reduce the cost the company has to pay in terms of keeping the product in working order and increasing its feature set. These services exist for _very good reasons_, and they're pervasive because they provide a large amount of value to their customers. Which kind of gets to your next point:

> Users are often never given the option of an alternative business model, and even when they are, the data collection machine never stops.

Users are given the option of alternative business models all the time. I've lost count of how many "privacy-preserving" services I've seen that promise to never collect your data, or never share your data, or whatever. I've also lost count of how many of them have died a slow death, because a) people don't care about that as a value proposition, b) they don't improve their services as quickly as their competitors, c) they're more costly than their competitors, and d) they just suck as services because they fail at basic usability.

There are significant upsides for _all parties_ when it comes to certain kinds of data collection (e.g. the telemetry data mentioned above, flow analysis in stores, etc.). These things come with no downside other than some bogeyman of "oh noes they have my data".
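
To make the telemetry example above concrete, the collection side is usually just a few lines in the app. A minimal sketch using Sentry's Python SDK (the DSN value is a placeholder, and the sample rate is just an illustrative choice):

    import sentry_sdk

    sentry_sdk.init(
        # Placeholder DSN; a real project would use its own ingest URL.
        dsn="https://examplePublicKey@o0.ingest.sentry.io/0",
        # Report errors, plus ~10% of transactions for performance data.
        traces_sample_rate=0.1,
    )

    # From here on, unhandled exceptions and sampled request timings are sent
    # to the vendor automatically--the "bug reports through automation" above.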

> Companies know that the allure of getting something for "free" is far more profitable than putting up a paywall, and so they justify extracting as much value as they can from their users' data. We're stuck in an endless cycle of having "free" services where the business transaction is far more valuable to the company than its users.

Again, fully agree, but I also understand _why_ this is the case. There are _significant_ upsides to many kinds of data collection, and those in favor of "enhanced privacy" to this day _cannot_ provide viable alternatives. The fact that it's extremely hard to build a service on a paid subscription that does not subsidize its development/research/feature set through data collection of some kind should be ample evidence that this is the case.

> And it's in the companies' best interest to keep the users, the general public and governments in the dark as far as the extent and details of how value is actually extracted. This is the most vile business model we've ever invented.

Kind of agree with this, except the last bit. I don't see why it's vile at all. Certainly it is in some cases (see the general trend of enshittification), but I don't think the only answer is "make everything _more_ private". That only increases distrust, which is the last thing we need right now.

> What privacy? The blockchain of most cryptocurrencies is a public ledger where all transactions are visible to everyone. Transactions are done almost exclusively via exchanges that follow strict KYC regulations. So it's pretty easy to identify the person behind any transaction.

Okay, here's a Bitcoin transaction[1]. Please identify who the participants are.

> Following your logic, should we ban all cash transactions, since they're even more private than cryptocurrencies?

No, why would we do that? That doesn't follow from pointing at cryptocurrency as a place where minimization of trust and maximization of privacy are the norms. This just seems like you're throwing out whatever other bad things you can attach to the strawman you've constructed from my argument.

> You have a twisted sense of the world, friend. My guess would be that you work for one of these companies, and this is how you rationalize the harm you're putting out in the world.

Wildly incorrect, as well.

But here's the thing... I'm not the one saying "when I go out in public, all people can get from me is the equivalent of a public IP address and a general location". I'm trying to point out that when I interact with the world, there's depth to the interactions that I literally _cannot_ know, and that I'm sharing information with the world that I'm not even aware that I'm sharing.

I'd argue that taking the world of public spaces filled with (very real) people who have as much depth (or more) to their lives than your own, and viewing that as "just a bunch of public IP addresses and general location information" is the truly twisted view of the world. People aren't cartoon cutouts, and places full of real people are also full of a wealth of information that is essential to the functioning of our societies.

The answer isn't to take all that information away--it's to get better at using it, for whatever definition of "better" we can all manage to agree on.


I won't reply to all your points, but I'll say this: I don't think I'm being overly reductive as much as you're being dismissive about the lengths to which tech companies go to build a profile of their users. The major difference between someone building a mental profile based on my interaction with them in the real world and the online profile companies build based on my digital footprint is that the real-world profile is not stored perpetually and shared, or rather sold, behind my back to countless interested parties that will later use it to psychologically manipulate me, whether that's to get me to buy something I didn't need, or to change my mind about a political or social topic. Additionally, my real-world profile is helpful in building tangible relationships, and I have full control over what I choose to share with the world. I may not be aware of all the information I'm sharing, but I'm certainly in control of it.

This is what privacy essentially is. Consent and reach _are_ keywords, since I get to decide what and how much I share publicly, and what I keep private. I can at any point stop interacting with whoever I want, at which point they only have a fading mental profile of me. This is not the case with digital profiles, since companies use dirty tricks to access as much of my data as possible, routinely exceed their boundaries, and, most crucially, keep this data and continue profiting from it in perpetuity. Even principles like the right to be forgotten in regulations like the GDPR are not sufficient to put an end to this, since once my profile is sold on the data broker market, it is forever part of it, and will likely end up back in the hands of the company that claims to have deleted it.

This is why I maintain that these practices are vile and nefarious, and why we have far too few and far too lenient regulations to control them.

Re: tracing the Bitcoin transaction, I don't claim to be able to do this personally, just that it's technically possible. There are companies like Chainalysis that offer this kind of analysis as a service, and there are many examples of law enforcement agencies tracking individuals using these techniques. I'm only pointing out the flaw in your argument that cryptocurrencies are some kind of privacy standard, and extrapolating it to cash, which would presumably foster even less of a shining example of a "pro-social culture".

In any case, I've exhausted my arguments on this topic, and it's clear we have very different views about privacy, so let's agree to disagree. Cheers!


> Consent and reach _are_ keywords, since I get to decide what and how much I share publicly, and what I keep private. I can at any point stop interacting with whoever I want, at which point they only have a fading mental profile of me.

I'll just point out that I think this is where I find the flaw in your argument: You _don't_ get to decide what and how much you share in public. You might try, but you will never succeed, because it's impossible to not share what you don't realize you're sharing. I think this is why I find your argument entirely unconvincing: if you're incapable of knowing the extent of what you're sharing with other people in public, informed consent is impossible by definition. And if you don't know the extent of what you're sharing, how can you know how far that information can reach?

Sure, being online makes collection and archival of what you share easier, but "more privacy" doesn't address the root cause, because the root cause is that you cannot be fully aware of what context you share with others in the world--both online and off. I don't think this is a problem, because I think this kind of unintentional information sharing provides all kinds of benefits to society at large. It's the kind of thing that drives "gut feelings" and other forms of intuition, and we'd be impoverished if everyone adopted the kind of privacy-maximizing standpoint you're advocating above.

But opposing viewpoints are good, so I'm not super concerned if I fail to convince you of anything, really. Cheers!



