
It's worth noting that the United States considers a right to keep and bear arms as important as a right to free speech.

Internationally, reasonable disagreement on rights seen by some as fundamental is to be expected.




You make a fair point, but the right to bear arms is mainly controversial for safety reasons. What safety issues are there with giving customers control over (or at least visibility into) their data?

It's also worth pointing out that the right to bear arms, given its origin, should probably be called the "right to revolt". While this is still a controversial issue for governments (governments don't want revolt), it's less controversial among citizens.


I think the safety issues with GDPR compliance are minimal (there's an argument to be made about the inefficiency it introduces and about contributing to warning blindness, but it's a weak one).

The larger arguments are about tradeoffs. What could companies be doing with the engineering resources devoted to GDPR compliance (including compliance with the consumer-frustrating applications of the law, like cookie walls)? Since the US doesn't have an equivalent law (and US companies only comply when they want to do business in the EU), we'll probably see over time whether the inefficiency introduced is worth the tradeoffs in a highly competitive world of software services.


I agree with you that the whole thing is inefficient. I think every GDPR advocate, much like me, will agree that it still needs to be improved and worked on. We're nowhere near done.


There is a safety issue in that data previously visible only internally is now also exposed to the customer/user, and to any unauthorized person successfully pretending to be them. The problem of "account compromised" now becomes a bigger (potentially much bigger) problem of "account compromised and juicy data exfiltrated via a GDPR data dump request", possibly followed immediately by a deletion request so that the authorized user can't even find out what was taken.


That seems like a huge reach. Most of the data that affects users can be acquired one way or another if you're logged in. Furthermore, GDPR requests are often handled outside the account itself, except by companies that have the resources to automate them (and those companies usually have a lot of security resources).

I'll grant you that the addition of human processes in there does present more security risks. I'm doubtful it adds safety risks, though.

I dunno, this all seems like an extension of the risks we already have. More data, what of it? If an account with sensitive data is compromised, you're most likely fucked regardless of whether the hacker gets a hold of that data.


It's kind of minor but still worth thinking about. The difference is that "sensitive data" now includes internal data that not even the account holder had access to or was aware of. You're right that it's an extension of current risks rather than a new class; it's one more attack vector against an existing surface. Before GDPR, the company itself would need to be compromised for that data to be exposed, and that happens often enough. After GDPR, the data for an individual can be exposed just by compromising that individual. Does that data matter? Maybe, maybe not; that's the question with all breaches. In the best-intentions world, a post-GDPR company no longer gathers as much data in the first place, so it can actually reduce or eliminate the impact of that attack surface being breached either way. I don't see that happening much in practice; the appetite for more and more data is strong.
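
To make that attack vector concrete, here's a minimal, self-contained sketch (in Python) of how an automated export/deletion flow could shrink it: step-up re-authentication before fulfilling the request, an out-of-band notification so the real owner finds out, and a grace period before deletion so evidence isn't erased silently. Every name here (Session, notify_owner, the 14-day grace period, etc.) is hypothetical and not taken from the GDPR text or any particular company's system.

    # Hypothetical sketch of a hardened GDPR export/deletion flow.
    from dataclasses import dataclass
    from datetime import datetime, timedelta

    DELETION_GRACE = timedelta(days=14)    # assumed policy, not mandated by the GDPR
    REAUTH_WINDOW = timedelta(minutes=10)  # how fresh the step-up re-auth must be

    @dataclass
    class Session:
        user_email: str
        last_reauth_at: datetime

        def recently_reauthenticated(self) -> bool:
            return datetime.utcnow() - self.last_reauth_at <= REAUTH_WINDOW

    def notify_owner(email: str, message: str) -> None:
        # Stand-in for an out-of-band channel (email, push notification, ...).
        print(f"[notify {email}] {message}")

    def handle_export(session: Session) -> str:
        # A stolen session cookie alone shouldn't be enough to pull the dump.
        if not session.recently_reauthenticated():
            return "step-up authentication required"
        notify_owner(session.user_email, "A full export of your data was requested.")
        # Real code would assemble the archive and return an expiring download link.
        return "export scheduled; download link sent to account email"

    def handle_deletion(session: Session) -> str:
        if not session.recently_reauthenticated():
            return "step-up authentication required"
        # Don't delete immediately: the grace period keeps a trail visible to the
        # legitimate owner even if the request came from an attacker.
        run_at = datetime.utcnow() + DELETION_GRACE
        notify_owner(session.user_email, f"Deletion scheduled for {run_at:%Y-%m-%d}; reply to cancel.")
        return "deletion scheduled after grace period"

    if __name__ == "__main__":
        fresh = Session("owner@example.com", datetime.utcnow())
        stale = Session("owner@example.com", datetime.utcnow() - timedelta(hours=3))
        print(handle_export(fresh))    # proceeds, owner is notified
        print(handle_deletion(stale))  # refused until re-authentication

None of this removes the extra exposure the parent comment describes; it just raises the cost of exploiting it from "steal a session" to "steal a session and pass re-authentication, without the owner noticing the notification".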



