Hacker News

>Not only has communication become easier, so has surveillance.

It is a devil's advocate response, but can you explain why this is a bad thing? If communication is scaling and becoming easier, why shouldn't surveillance?




1. We judge these programs under the wrong assumption that they are run by the good guys. Any system that depends on the goodness of the people running it is dangerous, because it can be taken over by the bad guys.

2. I am a law-abiding, innocent citizen. Why should I have to face the same compromised privacy as a criminal? It used to be that only people under suspicion were surveilled, and that a judge had to grant this based on preliminary evidence. Now everyone is surveilled. How long until people who sidestep surveillance (i.e., by using open-source systems that don't implement this stuff) fall under suspicion simply because they are not surveilled? How long until it's "guilty until proven otherwise"?

In my opinion it is absolutely fair and necessary to scan images that are uploaded to the cloud in a way that makes them shareable. But never the stuff I have on my device. And the scanning system has to be transparent.
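For what it's worth, the kind of cloud-side scanning being discussed is typically hash matching against a database of known images rather than humans looking at photos. A minimal sketch of the idea, using a toy average hash (real systems such as Apple's NeuralHash use learned perceptual hashes and blinded, encrypted databases; everything below is simplified for illustration):

```python
# Toy sketch of hash-based image matching, the general technique behind
# cloud-side scanning. Real deployments use learned perceptual hashes;
# this uses a simple "average hash" over a downscaled 8x8 grayscale image.

def average_hash(pixels):
    """pixels: 64 grayscale values (0-255) from a downscaled 8x8 image.
    Returns a 64-bit integer: each bit is 1 if that pixel is above the mean."""
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(image_pixels, known_hashes, threshold=4):
    """Flag an upload if its hash is within `threshold` bits of any known hash,
    so slightly re-encoded or resized copies still match."""
    h = average_hash(image_pixels)
    return any(hamming(h, k) <= threshold for k in known_hashes)

# Usage: a near-duplicate of a known image matches; an unrelated one does not.
known = [200] * 32 + [10] * 32      # half bright, half dark
altered = [195] * 32 + [20] * 32    # slightly re-encoded copy
db = [average_hash(known)]
print(matches_database(altered, db))   # near-duplicate -> True
print(matches_database([100] * 64, db))  # unrelated flat image -> False
```

The transparency question above is about exactly the parts this sketch glosses over: who supplies `known_hashes`, and whether anyone outside the operator can audit what is in that list.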


>We judge these programs under the wrong assumption that it is run by the good guys

Speaking as someone who as a boy was tortured and trafficked by operatives of the Central Intelligence Agency of the United States of America, I am surprised how little appreciation there is for the standard role of misdirection in the espionage playbook.

It is inevitable that all these ostensibly well-intended investigators will have their ranks and their leadership infiltrated by the intelligence agencies of major powers, all of whom invest in child trafficking. What better cover?


"Won't somebody think of the (white, wealthy, documented) chiiiiildren!"

:sigh:


> We judge these programs under the wrong assumption that it is run by the good guys. Any system that depends on the goodness of the people running it is dangerous, because it can be taken over by the bad guys.

And sometimes, the "good guys", in their attempts to be as good as they can imagine being, turn into the sort of people who'll look a Supreme Court judge, Congress, or an oversight committee in the eye and claim "It's not 'surveillance' until a human looks at it", after having built PRISM "to gather and store enormous quantities of users’ communications held by internet companies such as Google, Apple, Microsoft, and Facebook."

https://www.hrw.org/news/2017/09/14/q-us-warrantless-surveil...

Sure, we copied and stored your email archive and contact lists and instant messenger history. But we weren't "surveilling you" unless we _looked_ at it!

Who are these "good guys"? The intelligence community? The executive branch? The judicial branch? Because they're _all_ complicit in that piece of deceit about your privacy and rights.


Surveillance does have its place, but a large part of the problem with the new technology of surveillance is the people passing the laws on surveillance don't understand the technology that surrounds it.

Take, for instance, the metadata now so freely swept up by the American government without a warrant: the parties to a communication, the method of communication, its time and duration, and the locations of those involved. All of that can be collected without a warrant by national agencies, on a massive scale, in the name of "national security" under programs like PRISM.

Now, this is non-targeted, national surveillance: fishing for a "bad guy" with enhanced surveillance capabilities. This doesn't necessarily seem like a good thing. It seems like a lazy thing, and one that was ruled constitutional by people who chose not to understand the technology because that was easier than thinking downstream about the implications.


> is the people passing the laws on surveillance don't understand the technology that surrounds it.

And that the people running the surveillance have a rich track record of lying to the people who are considering whether to pass the laws they're proposing.

"Oh no, we would _NEVER_ surveil American citizens using these capabilities!"

"Oh yeah, except for all the mistakes we make."

"No - that's not 'surveillance', it's only metadata, not data. All we did was bulk-collect the call records of every American; we didn't 'surveil' them."

"Yeah, PRISM collects well over 80% of all email sent by Americans, but we only _read_ it if it matches a search we do across it. It's not 'surveillance'."

But they've stopped doing all that, right? And they totally haven't just shared that same work out amongst their Five Eyes counterparts, so that what each of them is doing is legal in its own jurisdiction even though strong laws prevent each of them from doing it domestically.

And how would we even know? Without Snowden we wouldn't know most of what we know about what they've been doing. And look at the thanks he got for that...


> It is a devil's advocate response, but can you explain why this is a bad thing?

I didn't state it as a bad thing, I stated it as a counter to your argument that encrypted communication is more common, and therefore maybe we should assess whether additional capabilities are warranted. Those capabilities already exist, and expanded before the increased encrypted communication (they happened during the analog phone line era). I would hazard that increasingly encrypted communication is really just people responding to that change in the status quo (or overreach, depending on how you view it) brought about by the major powers (whether they be governmental or corporate).


Why shouldn't every Neuralink come with a mandated FBI module preinstalled? Where, if anywhere, is private life to remain, where citizens think and communicate freely? Is Xinjiang just the start for the whole world?


Surely there is communication surrounding child sexual abuse: making plans with other pedophiles and discussing strategies. Some of this may occur over text message. Maybe Apple’s next iteration of this technology can use my $1000 phone, which I bought and which I own, to surveil my end-to-end encrypted private messages and generate a safety voucher for anything I say? The sky’s the limit!


Good point: neural nets are getting better at natural language every year.


Do you see how this underlines my original point? I brought up real child abuse that is happening today, and your response is about brain implants being mandated by the government. Most people are going to prioritize the tangible problem over worrying about your sci-fi dystopia.


A plausible story of how our rights are in the way is always ready to hand. If we can't at some point draw a clear line and say no, that's it, stop -- then we have no rights. It's chisel, chisel, chisel, year after year, decade after decade.

In America one of those lines is that your personal papers are private. Get a warrant. I don't have to justify this stand. I might choose to explain why it's a good stand, or I might not; it's on you to persuade us.


>In America one of those lines is that your personal papers are private. Get a warrant. I don't have to justify this stand. I might choose to explain why it's a good stand, or I might not; it's on you to persuade us.

Part of the problem is that these devices are encrypted so a warrant doesn't work on them. That is a big enough change that maybe people need to debate if the line is still in the right place.


That is a change worth considering, though it must be treated at the level of rights, not just a case-by-case utility calculus. At the same time, most other changes have been towards more surveillance and control: cameras everywhere, even in the sky; ubiquitous location tracking; rapidly improving AI to scale up these capabilities beyond what teams of humans could monitor; tracking of most payments; mass warrantless surveillance by spy agencies; God knows what else now, many years after the Snowden leaks. This talk you hear about the population "going dark" is... selective.

I think my vehemence last night might've obscured the point I wanted to make: what a right is supposed to be is a principle that overrides case-by-case utility analysis. I would agree that everything is open to questioning, including the right to privacy -- but as I see it, if you weigh the object-level balance of utilities for this particular proposal while explicitly dropping that larger context of privacy as a right (which was not arrived at for no reason), and denigrating that concern as science fiction, as a slippery-slope fallacy -- then that's a debate that should be rejected on its premise.


My memories and thoughts are also warrant-proof. We just accept it as a limitation.


Twenty years ago, a device inspecting, 24/7, the personal data of 60% of citizens against samples remotely provided by the government was sci-fi dystopia.


To whom do you give the keys?


Post your passwords.


That's too short a thought, it detracts from the point.

It is bad to post passwords not just because you would lose privacy, but because you would lose control of important accounts. Asking people to post their passwords is not reasonable.

I think you might have a point you're trying to make, but please spell it out fully.


No, actually I kind of like it. It boils the problem down to its core point -- trust.


Post your home address and phone number.



